DataSphereStudio is a one-stop data application development and management portal, covering scenarios including data exchange, desensitization/cleansing, analysis/mining, quality measurement, visualization, and task scheduling.

Overview

DSS

License

English | 中文

Introduction

DataSphere Studio (DSS for short) is a self-developed one-stop data application development and management portal of WeDataSphere, the big data platform of WeBank.

Based on the Linkis computation middleware, DSS can easily integrate upper-layer data application systems, making data application development simple and easy to use.

DataSphere Studio is positioned as a data application development portal whose closed loop covers the entire process of data application development. With a unified UI, its workflow-style graphical drag-and-drop development experience covers the entire lifecycle of data application development: from data import, desensitization and cleansing, data analysis, data mining, and quality inspection through visualization, scheduling, and data output.

Thanks to the connectivity, reusability, and simplification capabilities of Linkis, DSS natively provides financial-grade capabilities such as high concurrency, high availability, multi-tenant isolation, and resource management.

UI preview

Please be patient; the GIF may take some time to load.

DSS-V1.0 GIF

Core features

1. One-stop, full-process application development management UI

       DSS is highly integrated. Currently integrated systems include:

       a. Scriptis - Data Development IDE Tool.

       b. Visualis - Data Visualization Tool (based on the open-source project Davinci contributed by CreditEase)

       c. Qualitis - Data Quality Management Tool

       d. Azkaban - Batch workflow job scheduler

DSS one-stop video

2. AppJoint, a unique design concept based on Linkis

       AppJoint (application joint) defines unified front-end and back-end integration specifications, so external data application systems can be quickly and easily integrated as part of DSS data application development.

       DSS arranges multiple AppJoints in series to form a workflow that supports real-time execution and scheduled execution. Users can complete the entire process development of data applications with simple drag and drop operations.

       Since AppJoints are integrated with Linkis, external data application systems share Linkis's capabilities of resource management, concurrency limiting, and high performance. AppJoints also allow context to be shared across systems, completely eliminating application silos.
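The integration contract can be pictured with a minimal sketch. Note this is a hypothetical illustration, not the real DSS AppJoint API: the interface and class names (AppJointSketch, EchoAppJoint, AppJointDemo) are invented here.

```java
// Hypothetical sketch of an AppJoint-style integration contract.
// These names are invented for illustration and are NOT the real DSS API.
import java.util.Map;

interface AppJointSketch {
    String getName();                        // which external system this joint wraps
    String execute(Map<String, String> ctx); // run one workflow node with shared context
}

class EchoAppJoint implements AppJointSketch {
    public String getName() { return "echo"; }
    public String execute(Map<String, String> ctx) {
        // A real joint would call the external system's API here; the shared
        // context is what lets joints avoid application silos.
        return "echo:" + ctx.getOrDefault("input", "");
    }
}

public class AppJointDemo {
    public static void main(String[] args) {
        AppJointSketch joint = new EchoAppJoint();
        System.out.println(joint.execute(Map.of("input", "hello")));
    }
}
```

The point of the sketch is only the shape of the contract: each external system implements one small interface, and DSS arranges implementations in series as workflow nodes.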

3. Project, as the management unit

       With Project as the management unit, DSS organizes and manages the business applications of each data application system, and defines a set of common standards for collaborative development of projects across data application systems.

4. Integrated data application components

      a. Azkaban AppJoint —— Batch workflow job scheduler

         Many data applications developed by users usually require periodic scheduling capability.

         At present, the open-source scheduling systems in the community are rather unfriendly to integrate with other data application systems.

         DSS implements Azkaban AppJoint, which allows users to publish DSS workflows to Azkaban for regular scheduling.

         DSS also defines standard and generic workflow parsing and publishing specifications for scheduling systems, allowing other scheduling systems to easily achieve low-cost integration with DSS.

Azkaban
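The workflow parsing and publishing idea above can be sketched as follows. This is a simplified illustration, not DSS's actual publishing code: the node model and command naming are invented, while the `type`/`command`/`dependencies` keys follow Azkaban's .job property format.

```java
// Hypothetical sketch of a workflow -> scheduler "publish" step: each DSS-style
// node is rendered as an Azkaban-style .job properties snippet. The Node model
// and run_<type> command are invented for illustration.
import java.util.List;

public class PublishSketch {
    record Node(String name, String type, List<String> deps) {}

    // Render one node as an Azkaban-style .job properties snippet.
    static String toJobFile(Node n) {
        StringBuilder sb = new StringBuilder();
        sb.append("type=command\n");
        sb.append("command=run_").append(n.type()).append(" ").append(n.name()).append("\n");
        if (!n.deps().isEmpty()) {
            sb.append("dependencies=").append(String.join(",", n.deps())).append("\n");
        }
        return sb.toString();
    }

    public static void main(String[] args) {
        Node mail = new Node("send_report", "sendemail", List.of("clean_data"));
        System.out.print(toJobFile(mail));
    }
}
```

A generic specification like this is what lets other scheduling systems integrate with DSS at low cost: only the rendering step changes per scheduler.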

      b. Scriptis AppJoint —— Data Development IDE Tool

         What is Scriptis?

         Scriptis is for interactive data analysis with script development (SQL, PySpark, HiveQL), task submission (Spark, Hive), UDF, function, resource management, and intelligent diagnosis.

         Scriptis AppJoint integrates the data development capabilities of Scriptis to DSS, and allows various script types of Scriptis to serve as nodes in the DSS workflow to participate in the application development process.

         Currently supports HiveSQL, SparkSQL, Pyspark, Scala and other script node types.

Scriptis

      c. Visualis AppJoint —— Data Visualization Tool

         What is Visualis?

         Visualis is a BI tool for data visualization, based on the open-source project Davinci contributed by CreditEase. It provides financial-grade data visualization capabilities on the basis of data security and permissions.

         Visualis AppJoint integrates data visualization capabilities into DSS and allows displays and dashboards, as nodes of DSS workflows, to be associated with upstream data marts.

Visualis

      d. Qualitis AppJoint —— Data quality management Tool

         Qualitis AppJoint integrates data quality verification capabilities into DSS, allowing Qualitis to serve as a node in DSS workflows.

Qualitis

      e. Sender AppJoint —— Data Sender Tool

         Sender AppJoint provides data delivery capability for DSS. Currently it supports the SendEmail node type, and the result sets of all other nodes can be sent via email.

         For example, the SendEmail node can directly send the screenshot of a display as an email.

      f. Signal AppJoint —— Signal Nodes

         Signal AppJoint is used to strengthen the correlation between business and process while keeping them decoupled.

         DataChecker Node: Checks whether a table or partition exists.

         EventSender Node: Sends messages across workflows and projects.

         EventReceiver Node: Receives messages across workflows and projects.

      g. Function node

         Empty nodes and sub-workflow nodes.
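The DataChecker node described above boils down to a blocking existence check against the metastore. A minimal sketch, with an in-memory stub standing in for the real Hive metastore lookup (the table and partition names are placeholders):

```java
// Hypothetical sketch of what a DataChecker-style node does: report whether a
// table/partition exists so the workflow can wait on upstream data.
// metastoreStub stands in for a real Hive metastore query.
import java.util.Set;

public class DataCheckerSketch {
    // Stand-in for the metastore: the set of known "db.table/partition" keys.
    static final Set<String> metastoreStub = Set.of("warehouse.orders/ds=2024-01-01");

    static boolean check(String tableOrPartition) {
        return metastoreStub.contains(tableOrPartition);
    }

    public static void main(String[] args) {
        // A real node would retry this check on an interval until a timeout expires.
        System.out.println(check("warehouse.orders/ds=2024-01-01")); // true
        System.out.println(check("warehouse.orders/ds=2024-01-02")); // false
    }
}
```

EventSender/EventReceiver follow the same pattern in the other direction: instead of polling for data, one workflow publishes a signal that another workflow blocks on, which is how business processes stay correlated yet decoupled.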

Compared with similar systems

      DSS is an open source project leading the direction of data application development and management. The open source community currently does not have similar products.

Usage Scenarios

      DataSphere Studio is suitable for the following scenarios:

      1. Scenarios in which big data platform capability is being prepared or initialized but no data application tools are available.

      2. Scenarios in which users already have big data foundation platform capabilities but with only a few data application tools.

      3. Scenarios in which users have big data foundation platform capabilities and comprehensive data application tools, but suffer from strong isolation and high learning costs because those tools have not been integrated.

      4. Scenarios in which users have big data foundation platform capabilities and comprehensive data application tools, but lack unified and standardized specifications, even though some of these tools have been integrated.

Quick start

Click for the Quick start guide

Architecture

DSS Architecture

Documents

Compiled documentation

User manual

Quick integration with DSS for external systems

Communication

communication

License

DSS is under the Apache 2.0 license. See the License file for details.

Comments
  • Failed to create projects

    Failed to create projects

    The error message is:

    error code(错误码): 90002, error message(错误信息): add scheduler project failederrCode: 90019 ,desc: Connection reset ,ip: DataSphere-web1 ,port: 9004 ,serviceKind: dss-server.

    Checked dss server log, found the following error messages:

    2019-12-26 11:11:32.099 [ERROR] [qtp585718112-25                         ] c.w.w.d.a.s.a.s.AzkabanSecurityService (80) [login] - 获取session失败: java.net.SocketException: Connection reset
            at java.net.SocketInputStream.read(SocketInputStream.java:210) ~[?:1.8.0_232]
            at java.net.SocketInputStream.read(SocketInputStream.java:141) ~[?:1.8.0_232]
            at org.apache.http.impl.io.SessionInputBufferImpl.streamRead(SessionInputBufferImpl.java:137) ~[httpcore-4.4.7.jar:4.4.7]
            at org.apache.http.impl.io.SessionInputBufferImpl.fillBuffer(SessionInputBufferImpl.java:153) ~[httpcore-4.4.7.jar:4.4.7]
            at org.apache.http.impl.io.SessionInputBufferImpl.readLine(SessionInputBufferImpl.java:282) ~[httpcore-4.4.7.jar:4.4.7]
            at org.apache.http.impl.conn.DefaultHttpResponseParser.parseHead(DefaultHttpResponseParser.java:138) ~[httpclient-4.5.4.jar:4.5.4]
            at org.apache.http.impl.conn.DefaultHttpResponseParser.parseHead(DefaultHttpResponseParser.java:56) ~[httpclient-4.5.4.jar:4.5.4]
            at org.apache.http.impl.io.AbstractMessageParser.parse(AbstractMessageParser.java:259) ~[httpcore-4.4.7.jar:4.4.7]
    
    2019-12-26 11:11:32.106 [ERROR] [qtp585718112-25                         ] c.w.w.d.s.s.i.DWSProjectServiceImpl (166) [createSchedulerProject] - add scheduler project failed, com.webank.wedatasphere.dss.appjoint.exception.AppJointErrorException: errCode: 90019 ,desc: Connection reset ,ip: DataSphere-web1 ,port: 9004 ,serviceKind: dss-server
            at com.webank.wedatasphere.dss.appjoint.scheduler.azkaban.service.AzkabanSecurityService.login(AzkabanSecurityService.java:81) ~[?:?]
            at com.webank.wedatasphere.dss.server.function.FunctionInvoker.projectServiceAddFunction(FunctionInvoker.java:76) ~[dss-server-0.6.0.jar:?]
            at com.webank.wedatasphere.dss.server.service.impl.DWSProjectServiceImpl.createSchedulerProject(DWSProjectServiceImpl.java:161) [dss-server-0.6.0.jar:?]
            at com.webank.wedatasphere.dss.server.service.impl.DWSProjectServiceImpl.addProject(DWSProjectServiceImpl.java:109) [dss-server-0.6.0.jar:?]
            at com.webank.wedatasphere.dss.server.service.impl.DWSProjectServiceImpl$$FastClassBySpringCGLIB$$fe55cc96.invoke(<generated>) [dss-server-0.6.0.jar:?]
            at org.springframework.cglib.proxy.MethodProxy.invoke(MethodProxy.java:204) [spring-core-5.0.7.RELEASE.jar:5.0.7.RELEASE]
            at org.springframework.aop.framework.CglibAopProxy$CglibMethodInvocation.invokeJoinpoint(CglibAopProxy.java:746) [spring-aop-5.0.7.RELEASE.jar:5.0.7.RELEASE]
    

    Azkaban is running.

    tcp        0      0 0.0.0.0:12321           0.0.0.0:*               LISTEN      4789/java
    tcp        0      0 0.0.0.0:5005            0.0.0.0:*               LISTEN      4789/java
    tcp        0      0 0.0.0.0:8081            0.0.0.0:*               LISTEN      4789/java
    
    opened by RainFlying 10
  • DSS-0.9.0 community communication

    DSS-0.9.0 community communication

    DSS 0.9.0 is an important version completed under the lead of Ctyun with the help of WeBank. This version introduces the concepts of "DSS Integration Standard" and "Workspace" to build a one-stop big data development and service platform and greatly improve the user experience of big data development.

    Introduction to the concept:

    DSS Integration Standard: a unified access specification for application systems, composed of the SecurityService extracted from DSS's AppJoint specification, so that enterprises can easily connect their big data products to DSS and quickly acquire one-stop big data application store capabilities.

    Workspace: enables different types of data applications (e.g. offline workflow applications, real-time applications, data API services) to be organized and managed from a unified perspective, improving application development and management standards.

    The specific development content is as follows:

    • Home page reconstruction: adds the workspace feature, a case experience area, a quick start area, and more, to give new users better guidance;
    • New workspace page, in which users can carry out all kinds of big data development projects in a one-stop manner:
      • Common functions sub-module: users can customize frequently used functions for quick access;
      • Application store sub-module: users can easily browse and search all big-data-related development tools;
      • Administrator sub-module: the workspace administrator can configure and manage the workspace;
    • Fixes for some known bugs.


    opened by Adamyuanyuan 8
  • [Feature] Password-free login when accessing Qualitis and Schedulis from DSS

    [Feature] Password-free login when accessing Qualitis and Schedulis from DSS

    Search before asking

    • [X] I had searched in the issues and found no similar feature requirement.

    Problem Description

    Every time Qualitis or Schedulis is accessed from DSS, a separate login is required. This is very cumbersome with multiple users, and user isolation is not implemented.

    Description

    No response

    Use case

    No response

    solutions

    No response

    Anything else

    No response

    Are you willing to submit a PR?

    • [X] Yes I am willing to submit a PR!
    enhancement 
    opened by xyh15864643181 5
  • Problems at startup

    Problems at startup

    The simplified installation of Linkis (0.9.2) went more or less normally: except for the JDBC service, which did not start, all other services started fine. The simplified installation of DSS (0.6.0) shows no obvious error logs, and I don't know how to resolve this:

    <-------------------------------->
    Begin to start dss-server
    INFO: + End to start dss-server
    <-------------------------------->
    <-------------------------------->
    Begin to start dss-flow-execution-entrance
    INFO: + End to start dss-flow-execution-entrance
    <-------------------------------->
    <-------------------------------->
    Begin to start linkis-appjoint-entrance
    INFO: + End to start linkis-appjoint-entrance
    <-------------------------------->
    <-------------------------------->
    Begin to start visualis-server
    INFO: + End to start visualis-server
    <-------------------------------->

    opened by GJMZ 4
  • Failed to create workflow in a project

    Failed to create workflow in a project

    The error message was:

    " operation failed(操作失败)s!the reason(原因):HttpClientResultException: errCode: 10905 ,desc: URL http://127.0.0.1:9001/api/rest_j/v1/bml/upload request failed! ResponseBody is {"method":null,"status":1,"message":"error code(错误码): 50073, error message(错误信息): 提交上传资源任务失败:\n### Error updating database. Cause: com.mysql.jdbc.exceptions.jdbc4.MySQLSyntaxErrorException: Table 'linkis.linkis_resources_task' doesn't exist\n### The error may involve com.webank.wedatasphere.linkis.bml.dao.TaskDao.insert-Inline\n### The error occurred while setting parameters\n### SQL: INSERT INTO linkis_resources_task( resource_id,version,operation,state, submit_user,system,instance, client_ip,err_msg,start_time,end_time,last_update_time, extra_params ) VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)\n### Cause: com.mysql.jdbc.exceptions.jdbc4.MySQLSyntaxErrorException: Table 'linkis.linkis_resources_task' doesn't exist\n; bad SQL grammar []; nested exception is com.mysql.jdbc.exceptions.jdbc4.MySQLSyntaxErrorException: Table 'linkis.linkis_resources_task' doesn't exist.","data":{"errorMsg":{"serviceKind":"bml-server","level":2,"port":9999,"errCode":50073,"ip":"DataSphere-web1","desc":"提交上传资源任务失败:\n### Error updating database. Cause: com.mysql.jdbc.exceptions.jdbc4.MySQLSyntaxErrorException: Table 'linkis.linkis_resources_task' doesn't exist\n### The error may involve com.webank.wedatasphere.linkis.bml.dao.TaskDao.insert-Inline\n### The error occurred while setting parameters\n### SQL: INSERT INTO linkis_resources_task( resource_id,version,operation,state, submit_user,system,instance, client_ip,err_msg,start_time,end_time,last_update_time, extra_params ) VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)\n### Cause: com.mysql.jdbc.exceptions.jdbc4.MySQLSyntaxErrorException: Table 'linkis.linkis_resources_task' doesn't exist\n; bad SQL grammar []; nested exception is com.mysql.jdbc.exceptions.jdbc4.MySQLSyntaxErrorException: Table 'linkis.linkis_resources_task' doesn't exist"}}}. 
,ip: DataSphere-web1 ,port: 9004 ,serviceKind: dss-server "

    According to the page https://github.com/WeBankFinTech/DataSphereStudio/blob/master/docs/zh_CN/ch1/DSS%E5%AE%89%E8%A3%85%E5%B8%B8%E8%A7%81%E9%97%AE%E9%A2%98%E5%88%97%E8%A1%A8.md#6-dss%E5%88%9B%E5%BB%BA%E5%B7%A5%E7%A8%8B%E6%8A%A5%E8%A1%A8linkislinkis_resources_task%E4%B8%8D%E5%AD%98%E5%9C%A8, I needed to run db/moudle/linkis-bml.sql manually.

    Tried this SQL file and found that only the linkis_resources_task table was missing, so I ran that statement manually in the MySQL session. The above error was gone, but a new one appeared.

    " operation failed(操作失败)s!the reason(原因):HttpClientResultException: errCode: 10905 ,desc: URL http://127.0.0.1:9001/api/rest_j/v1/bml/upload request failed! ResponseBody is {"method":null,"status":1,"message":"error code(错误码): 50073, error message(错误信息): 提交上传资源任务失败:\n### Error updating database. Cause: com.mysql.jdbc.exceptions.jdbc4.MySQLIntegrityConstraintViolationException: Column 'system' cannot be null\n### The error may involve com.webank.wedatasphere.linkis.bml.dao.TaskDao.insert-Inline\n### The error occurred while setting parameters\n### SQL: INSERT INTO linkis_resources_task( resource_id,version,operation,state, submit_user,system,instance, client_ip,err_msg,start_time,end_time,last_update_time, extra_params ) VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)\n### Cause: com.mysql.jdbc.exceptions.jdbc4.MySQLIntegrityConstraintViolationException: Column 'system' cannot be null\n; SQL []; Column 'system' cannot be null; nested exception is com.mysql.jdbc.exceptions.jdbc4.MySQLIntegrityConstraintViolationException: Column 'system' cannot be null.","data":{"errorMsg":{"serviceKind":"bml-server","level":2,"port":9999,"errCode":50073,"ip":"DataSphere-web1","desc":"提交上传资源任务失败:\n### Error updating database. 
Cause: com.mysql.jdbc.exceptions.jdbc4.MySQLIntegrityConstraintViolationException: Column 'system' cannot be null\n### The error may involve com.webank.wedatasphere.linkis.bml.dao.TaskDao.insert-Inline\n### The error occurred while setting parameters\n### SQL: INSERT INTO linkis_resources_task( resource_id,version,operation,state, submit_user,system,instance, client_ip,err_msg,start_time,end_time,last_update_time, extra_params ) VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)\n### Cause: com.mysql.jdbc.exceptions.jdbc4.MySQLIntegrityConstraintViolationException: Column 'system' cannot be null\n; SQL []; Column 'system' cannot be null; nested exception is com.mysql.jdbc.exceptions.jdbc4.MySQLIntegrityConstraintViolationException: Column 'system' cannot be null"}}}. ,ip: DataSphere-web1 ,port: 9004 ,serviceKind: dss-server "

    mysql> show create table linkis_resources_task \G
    *************************** 1. row ***************************
           Table: linkis_resources_task
    Create Table: CREATE TABLE `linkis_resources_task` (
      `id` bigint(20) NOT NULL AUTO_INCREMENT,
      `resource_id` varchar(50) DEFAULT NULL COMMENT '资源id,资源的uuid',
      `version` varchar(20) DEFAULT NULL COMMENT '当前操作的资源版本号',
      `operation` varchar(20) NOT NULL COMMENT '操作类型.upload = 0, update = 1',
      `state` varchar(20) NOT NULL DEFAULT 'Schduled' COMMENT '任务当前状态:Schduled, Running, Succeed, Failed,Cancelled',
      `submit_user` varchar(20) NOT NULL DEFAULT '' COMMENT '任务提交用户名',
      `system` varchar(20) NOT NULL DEFAULT '' COMMENT '子系统名 wtss',
      `instance` varchar(50) NOT NULL COMMENT '物料库实例',
      `client_ip` varchar(50) DEFAULT NULL COMMENT '请求IP',
      `extra_params` text COMMENT '额外关键信息.如批量删除的资源IDs及versions,删除资源下的所有versions',
      `err_msg` varchar(2000) DEFAULT NULL COMMENT '任务失败信息.e.getMessage',
      `start_time` datetime NOT NULL DEFAULT CURRENT_TIMESTAMP COMMENT '开始时间',
      `end_time` datetime DEFAULT NULL COMMENT '结束时间',
      `last_update_time` datetime NOT NULL COMMENT '最后更新时间',
      PRIMARY KEY (`id`),
      UNIQUE KEY `resource_id_version` (`resource_id`,`version`,`operation`)
    ) ENGINE=InnoDB DEFAULT CHARSET=utf8mb4
    1 row in set (0.00 sec)
    
    opened by RainFlying 4
  • [Bug] java.lang.ClassNotFoundException: org.apache.linkis.common.utils.Logging$class

    [Bug] java.lang.ClassNotFoundException: org.apache.linkis.common.utils.Logging$class

    Search before asking

    • [X] I searched the issues and found no similar issues.

    DSS Component

    dss-commons, dss-appconn, dss-framework, dss-orchestrator, dss-standard, dss-plugins, dss-apps/dss-apiservice, dss-web/dss-scriptis, dss-web/dss-workflow, dss-web/workspace, dss-web/dss-apiservice, dss-web/framework

    What happened + What you expected to happen

    I compiled Linkis 1.1.1 and then compiled DSS. On startup it fails with:

    Caused by: java.lang.ClassNotFoundException: org.apache.linkis.common.utils.Logging$class
            at java.net.URLClassLoader.findClass(URLClassLoader.java:382) ~[?:1.8.0_221]
            at java.lang.ClassLoader.loadClass(ClassLoader.java:424) ~[?:1.8.0_221]
            at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:349) ~[?:1.8.0_221]
            at java.lang.ClassLoader.loadClass(ClassLoader.java:357) ~[?:1.8.0_221]
            at org.apache.linkis.rpc.RPCReceiveRestful.(RPCReceiveRestful.scala:37) ~[linkis-rpc-1.1.1.jar:1.1.1]
            at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) ~[?:1.8.0_221]
            at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62) ~[?:1.8.0_221]
            at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45) ~[?:1.8.0_221]
            at java.lang.reflect.Constructor.newInstance(Constructor.java:423) ~[?:1.8.0_221]
            at org.springframework.beans.BeanUtils.instantiateClass(BeanUtils.java:204) ~[spring-beans-5.2.15.RELEASE.jar:5.2.15.RELEASE]
            at org.springframework.beans.factory.support.SimpleInstantiationStrategy.instantiate(SimpleInstantiationStrategy.java:87) ~[spring-beans-5.2.15.RELEASE.jar:5.2.15.RELEASE]
            at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.instantiateBean(AbstractAutowireCapableBeanFactory.java:1315) ~[spring-beans-5.2.15.RELEASE.jar:5.2.15.RELEASE]
            ... 18 more

    In DSS, org.apache.linkis.common.utils.Logging is an abstract class; after compilation there is no Logging$class file.

    Relevant platform

    DSS 1.1.1 cannot start after compilation.

    Reproduction script

    sbin/dss-daemon.sh start dss-apiservice-serve

    Anything else

    No response

    Are you willing to submit a PR?

    • [X] Yes I am willing to submit a PR!
    bug 
    opened by liaotian1005 3
  • Bump fastjson from 1.2.70 to 1.2.83 in /dss-apps/dss-datapipe-server

    Bump fastjson from 1.2.70 to 1.2.83 in /dss-apps/dss-datapipe-server

    Bumps fastjson from 1.2.70 to 1.2.83.

    Release notes

    Sourced from fastjson's releases.

    FASTJSON 1.2.83 released (security fix)

    This is a security fix release. It patches a recently reported vulnerability that can bypass the autoType blocking restriction in specific scenarios; fastjson users are advised to take security measures as soon as possible.

    Security fix guide: https://github.com/alibaba/fastjson/wiki/security_update_20220523

    Issues

    1. Security hardening
    2. Fix setAccessible error under JDK 17 #4077

    fastjson 1.2.79 released, bug fixes

    Another bug-fix release; upgrade as needed.

    Issues

    1. Fix a serialization error in some scenarios caused by the introduction of MethodInheritanceComparator
    2. Improve JDK 9 compatibility
    3. Fix JSONArray/JSONObject equals not returning true directly when the internal map/list objects are identical

    Related links

    fastjson 1.2.76 released, bug fixes and compatibility improvements

    Another bug-fix release; upgrade as needed.

    Issues

    1. Fix some cases that directly threw RuntimeException #3631
    2. Parser automatically recognizes gzip bytes #3614
    3. Fix automatic type conversion not being supported for properties of Throwable subclasses #3217
    4. Fix incorrect reference calculation under PrettyFormat #3672
    5. Fix AutoType incompatibility with LinkedHashMap
    6. Improve custom type conversion for Enum types
    7. Fix deserializeUsing not working in some generic scenarios #3693
    8. Improve JSONReader performance and reduce small-object creation #3627
    9. Improve JSONPath filter support #3629
    10. JSONPath supports an option to ignore NullValue #3607
    11. Improve support for customized enums #3601
    12. Improve support for java.time.Instant and org.joda.time.Instant #3539
    13. Fix Parser failing to recognize references in some scenarios

    Related links

    fastjson 1.2.75 released, routine bug fixes

    ... (truncated)

    Commits

    Dependabot compatibility score

    Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting @dependabot rebase.


    Dependabot commands and options

    You can trigger Dependabot actions by commenting on this PR:

    • @dependabot rebase will rebase this PR
    • @dependabot recreate will recreate this PR, overwriting any edits that have been made to it
    • @dependabot merge will merge this PR after your CI passes on it
    • @dependabot squash and merge will squash and merge this PR after your CI passes on it
    • @dependabot cancel merge will cancel a previously requested merge and block automerging
    • @dependabot reopen will reopen this PR if it is closed
    • @dependabot close will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually
    • @dependabot ignore this major version will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself)
    • @dependabot ignore this minor version will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself)
    • @dependabot ignore this dependency will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)
    • @dependabot use these labels will set the current labels as the default for future PRs for this repo and language
    • @dependabot use these reviewers will set the current reviewers as the default for future PRs for this repo and language
    • @dependabot use these assignees will set the current assignees as the default for future PRs for this repo and language
    • @dependabot use this milestone will set the current milestone as the default for future PRs for this repo and language

    You can disable automated security fix PRs for this repo from the Security Alerts page.

    dependencies java 
    opened by dependabot[bot] 3
  • [Bug] (v1.0.1) {

    [Bug] (v1.0.1) {"status":404,"error":"Not Found","message":"","path":"/api/rest_j/v1/dss/framework/workspace/workspaces"}

    Search before asking

    • [X] I searched the issues and found no similar issues.

    DSS Component

    dss-apps/dss-apiservice

    What happened + What you expected to happen

    I am using the all-in-one bundle version 1.0.1 with one-click installation. Startup succeeded, the login page loads, and login succeeds, but after redirecting to the main page the APIs return 404. Is this bug caused by my installation or by the software itself?

    {"timestamp":1648986200967,"status":404,"error":"Not Found","message":"","path":"/api/rest_j/v1/dss/framework/workspace/workspaces"}

    {"timestamp":1648986200972,"status":404,"error":"Not Found","message":"","path":"/api/rest_j/v1/dss/framework/workspace/workspaces/videos"}

    {"timestamp":1648986200995,"status":404,"error":"Not Found","message":"","path":"/api/rest_j/v1/dss/framework/workspace/getBaseInfo"}

    {"timestamp":1648986200961,"status":404,"error":"Not Found","message":"","path":"/api/rest_j/v1/dss/framework/workspace/workspaces/departments"}

    Relevant platform

    centos7

    Reproduction script

    null

    Anything else

    No response

    Are you willing to submit a PR?

    • [X] Yes I am willing to submit a PR!
    bug 
    opened by gongzh021 3
  • dss1.0.1 + linkis1.0.3: deployment script error, Linkis installation fails

    dss1.0.1 + linkis1.0.3: deployment script error, Linkis installation fails

    Search before asking

    • [X] I searched the issues and found no similar issues.

    DSS Component

    dss-standard

    What happened + What you expected to happen

    You chose not execute table-building statements
    create hdfs directory and local directory
    Succeed to + create file:///tmp/linkis/ directory
    Succeed to + create hdfs:///tmp/linkis directory
    Succeed to + create hdfs:///tmp/linkis directory
    create dir LINKIS_HOME: /home/hdfs/DSS-Linkis/linkis-pre-install/LinkisInstall
    Succeed to + Create the dir of /home/hdfs/DSS-Linkis/linkis-pre-install/LinkisInstall
    Start to cp /home/hdfs/DSS-Linkis/linkis-pre-install/linkis-package to /home/hdfs/DSS-Linkis/linkis-pre-install/LinkisInstall.
    Succeed to + cp /home/hdfs/DSS-Linkis/linkis-pre-install/linkis-package to /home/hdfs/DSS-Linkis/linkis-pre-install/LinkisInstall
    Update config...
    update conf /home/hdfs/DSS-Linkis/linkis-pre-install/LinkisInstall/conf/linkis.properties
    update conf /home/hdfs/DSS-Linkis/linkis-pre-install/LinkisInstall/conf/linkis-mg-gateway.properties
    update conf /home/hdfs/DSS-Linkis/linkis-pre-install/LinkisInstall/conf/linkis-ps-publicservice.properties
    Congratulations! You have installed Linkis 1.0.3 successfully, please use sh /home/hdfs/DSS-Linkis/linkis-pre-install/LinkisInstall/sbin/linkis-start-all.sh to start it!
    Your default account password is hdfs/18bdaf152
    Succeed to + install Linkis

    Relevant platform

    centos7

    Reproduction script

    deploy user

    deployUser=hdfs

    HIVE_META_URL=jdbc:mysql://hadoop01:3306/hive?characterEncoding=UTF-8
    HIVE_META_USER=hive
    HIVE_META_PASSWORD=hive

    ###HADOOP CONF DIR
    #/appcom/config/hadoop-config
    HADOOP_CONF_DIR=/etc/hadoop/3.1.5.0-152/0/
    ###HIVE CONF DIR
    #/appcom/config/hive-config
    HIVE_CONF_DIR=/etc/hive/3.1.5.0-152/0/
    ###SPARK CONF DIR
    #/appcom/config/spark-config
    SPARK_CONF_DIR=/etc/spark2/3.1.5.0-152/0/

    Engine version conf

    #SPARK_VERSION
    SPARK_VERSION=2.3.0
    ##HIVE_VERSION
    HIVE_VERSION=3.1.0
    #PYTHON_VERSION=python2

    Anything else

    No response

    Are you willing to submit a PR?

    • [ ] Yes I am willing to submit a PR!
    bug 
    opened by qiaobu 3
  • schedulis:pom:0.6.1 could not find

    schedulis:pom:0.6.1 could not find

    Running maven clean package fails with: Could not find artifact com.webank.wedatasphere.schedulis:schedulis:pom:0.6.1

    This artifact is the parent of com.webank.wedatasphere.schedulis:azkaban-common:pom:0.6.1.

    opened by jackxu2011 3
  • Workflow is blank when reopened after saving

    Workflow is blank when reopened after saving

    DSS 0.9.1 and Linkis 0.9.4, compiled against HDP 3.1.4 (Hadoop 3.1, Hive 3.0, Spark 2.3). I can write and execute Spark SQL scripts in Scriptis. I created a workflow, dragged a sparksql node onto it, wrote a SQL script, and saved the workflow. After closing and reopening it, the workflow shows a blank page.

    How should I debug this problem?

    opened by lordk911 3
  • [Feature] compatible with linkis 1.3.1

    [Feature] compatible with linkis 1.3.1

    Search before asking

    • [X] I had searched in the issues and found no similar feature requirement.

    Problem Description

    1. dss-flow-execution-server is missing the linkis-scheduler dependency:
            <dependency>
                <groupId>org.apache.linkis</groupId>
                <artifactId>linkis-scheduler</artifactId>
                <version>${linkis.version}</version>
            </dependency>
    
    2. dss-orchestrator/orchestrators/dss-workflow/dss-flow-execution-server/src/main/java/com/webank/wedatasphere/dss/flow/execution/entrance/persistence/WorkflowPersistenceEngine.java should remove the subJob-related methods:
    @Override
    public void persist(SubJobInfo subjobInfo) throws ErrorException {
    }

    @Override
    public void updateIfNeeded(SubJobInfo subJobInfo) throws ErrorException {
    }

    @Override
    public SubJobDetail retrieveJobDetailReq(Long jobDetailId) throws ErrorException {
        return null;
    }
    
    

    Description

    no

    Use case

    no

    solutions

    no

    Anything else

    no

    Are you willing to submit a PR?

    • [ ] Yes I am willing to submit a PR!
    enhancement 
    opened by jackxu2011 0
  • Bump spring-beans from 5.2.12.RELEASE to 5.2.20.RELEASE in /dss-appconn/appconns/dss-dolphinscheduler-appconn

    Bump spring-beans from 5.2.12.RELEASE to 5.2.20.RELEASE in /dss-appconn/appconns/dss-dolphinscheduler-appconn

    Bumps spring-beans from 5.2.12.RELEASE to 5.2.20.RELEASE.

    Release notes

    Sourced from spring-beans's releases.

    v5.2.20.RELEASE

    :star: New Features

    • Restrict access to property paths on Class references #28262
    • Improve diagnostics in SpEL for large array creation #28257

    v5.2.19.RELEASE

    :star: New Features

    • Declare serialVersionUID on DefaultAopProxyFactory #27785
    • Use ByteArrayDecoder in DefaultClientResponse::createException #27667

    :lady_beetle: Bug Fixes

    • ProxyFactoryBean getObject called before setInterceptorNames, silently creating an invalid proxy [SPR-7582] #27817
    • Possible NPE in Spring MVC LogFormatUtils #27783
    • UndertowHeadersAdapter's remove() method violates Map contract #27593
    • Fix assertion failure messages in DefaultDataBuffer.checkIndex() #27577

    :notebook_with_decorative_cover: Documentation

    • Lazy annotation throws exception if non-required bean does not exist #27660
    • Incorrect Javadoc in [NamedParameter]JdbcOperations.queryForObject methods regarding exceptions #27581
    • DefaultResponseErrorHandler update javadoc comment #27571

    :hammer: Dependency Upgrades

    • Upgrade to Reactor Dysprosium-SR25 #27635
    • Upgrade to Log4j2 2.16.0 #27825

    v5.2.18.RELEASE

    :star: New Features

    • Enhance DefaultResponseErrorHandler to allow logging complete error response body #27558
    • DefaultMessageListenerContainer does not log an error/warning when consumer tasks have been rejected #27457

    :lady_beetle: Bug Fixes

    • Performance impact of con.getContentLengthLong() in AbstractFileResolvingResource.isReadable() downloading huge jars to check component length #27549
    • Performance impact of ResourceUrlEncodingFilter on HttpServletResponse#encodeURL #27548
    • Avoid duplicate JCacheOperationSource bean registration in #27547
    • Non-escaped closing curly brace in RegEx results in initialization error on Android #27502
    • Proxy generation with Java 17 fails with "Cannot invoke "Object.getClass()" because "cause" is null" #27498
    • ConcurrentReferenceHashMap's entrySet violates the Map contract #27455

    :hammer: Dependency Upgrades

    • Upgrade to Reactor Dysprosium-SR24 #27526

    v5.2.17.RELEASE

    ... (truncated)

    Commits
    • cfa701b Release v5.2.20.RELEASE
    • 996f701 Refine PropertyDescriptor filtering
    • 90cfde9 Improve diagnostics in SpEL for large array creation
    • 94f52bc Upgrade to Artifactory Resource 0.0.17
    • d4478ba Upgrade Java versions in CI image
    • 136e6db Upgrade Ubuntu version in CI images
    • 8f1f683 Upgrade Java versions in CI image
    • ce2367a Upgrade to Log4j2 2.17.1
    • acf7823 Next development version (v5.2.20.BUILD-SNAPSHOT)
    • 1a03ffe Upgrade to Log4j2 2.16.0
    • Additional commits viewable in compare view


    dependencies java 
    opened by dependabot[bot] 0
  • [Bug] "You are not logged in, please login first" when redirecting from DSS 1.1.1 to Exchangis 1.0.0

    [Bug] "You are not logged in, please login first" when redirecting from DSS 1.1.1 to Exchangis 1.0.0

    Search before asking

    • [X] I searched the issues and found no similar issues.

    DSS Component

    dss-appconn

    What happened + What you expected to happen

    A not-logged-in error occurs when redirecting from DSS to Exchangis.

    (screenshot)

    Relevent platform

    linkis 1.2.0 dss 1.1.1 exchangis 1.0.0

    Reproduction script

    (screenshot) The problem persists after trying the method above.

    Anything else

    Opening the Exchangis frontend directly does obtain the cookie.

    Are you willing to submit a PR?

    • [X] Yes I am willing to submit a PR!
    bug 
    opened by Yiutto 0
  • [Feature] Dynamic workflow parameters

    [Feature] Dynamic workflow parameters

    Search before asking

    • [X] I had searched in the issues and found no similar feature requirement.

    Problem Description

    Same as the workflow parameter feature in DolphinScheduler: output the computation result of an upstream node into workflow context parameters for use by downstream (script/function) nodes, overriding the default system parameters. DolphinScheduler usage reference: https://blog.csdn.net/godlovedaniel/article/details/124934124. This would allow data ingestion jobs to dynamically modify source/target table names. For example, if an ingestion job has default task parameters sourceTableName and targetTableName and the source table is sharded, the source table name can be computed by an upstream node and modified dynamically.

    Description

    DolphinScheduler usage reference: https://blog.csdn.net/godlovedaniel/article/details/124934124

    Use case

    This would allow ingestion jobs to dynamically modify source/target table names. For example, if an ingestion job has default task parameters sourceTableName and targetTableName and the source table is sharded, the source table name can be computed by an upstream node and modified dynamically.

    solutions

    DolphinScheduler usage reference: https://blog.csdn.net/godlovedaniel/article/details/124934124

    Anything else

    No response

    Are you willing to submit a PR?

    • [X] Yes I am willing to submit a PR!
    enhancement 
    opened by hcl3039359 0
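    The mechanism requested above (an upstream node publishes a computed value into a shared workflow context, and downstream nodes read it in place of the static default) can be sketched roughly as follows. Note that every class and method name here is a hypothetical illustration, not an existing DSS or DolphinScheduler API:

    ```java
    import java.util.Map;
    import java.util.concurrent.ConcurrentHashMap;

    // Hypothetical sketch of workflow-context parameter passing: an upstream
    // node publishes a computed value (e.g. a resolved sharded table name)
    // and a downstream node reads it, shadowing the static default.
    class WorkflowContextSketch {
        private final Map<String, String> params = new ConcurrentHashMap<>();

        WorkflowContextSketch(Map<String, String> defaults) {
            params.putAll(defaults);
        }

        // Called by an upstream node after it computes a value.
        void publish(String key, String value) {
            params.put(key, value);
        }

        // Called by a downstream node; dynamically published values
        // override the defaults supplied at workflow creation time.
        String resolve(String key) {
            return params.get(key);
        }

        public static void main(String[] args) {
            WorkflowContextSketch ctx =
                new WorkflowContextSketch(Map.of("sourceTableName", "ods_user"));
            // An upstream node resolves the current shard name dynamically.
            ctx.publish("sourceTableName", "ods_user_202401");
            System.out.println(ctx.resolve("sourceTableName")); // ods_user_202401
        }
    }
    ```

    An ingestion node would then call something like `resolve("sourceTableName")` at execution time instead of reading the value fixed when the workflow was saved.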
  • [Bug] Failed to update project information

    [Bug] Failed to update project information

    Search before asking

    • [X] I searched the issues and found no similar issues.

    DSS Component

    dss-appconn, dss-framework

    What happened + What you expected to happen

    Workspace Management - Configuration - Update Information Failed to update project information after installing exchangis appconn.

    Relevent platform

    DSS 1.1.1+Linkis 1.1.1

    Reproduction script

    Workspace Management - Configuration - Update Information

    Anything else

    No response

    Are you willing to submit a PR?

    • [ ] Yes I am willing to submit a PR!
    bug 
    opened by utopianet 0
  • [Feature] Both Schedulis and DolphinScheduler dispatching centers are supported

    [Feature] Both Schedulis and DolphinScheduler dispatching centers are supported

    Search before asking

    • [X] I had searched in the issues and found no similar feature requirement.

    Problem Description

    At present, the scheduling center in a workflow can only support one scheduling tool at a time; it cannot support both simultaneously.

    Description

    Since there is only one dispatching center parameter, two dispatching centers cannot coexist at the same time.

    It is expected that a new parameter can be added so that the dispatching centers of both schedulers can be used simultaneously.

    Use case

    No response

    solutions

    No response

    Anything else

    No response

    Are you willing to submit a PR?

    • [ ] Yes I am willing to submit a PR!
    enhancement 
    opened by utopianet 0
Releases(1.1.1)
  • 1.1.1(Dec 1, 2022)

    DataSphereStudio 1.1.1 is an update to version 1.1.0. It mainly adds support for creating workflows without an integrated scheduling system, fixes publishing scripts as RESTful APIs, fixes the password-free jump exceptions for Schedulis, Qualitis, and DolphinScheduler, fixes the exception when publishing Qualitis nodes to DolphinScheduler, fixes the workspace management user-list interface exception, and fixes the workflow-editing exception after publishing to DolphinScheduler.

    Note that the 'subFlow' node in DataSphereStudio cannot currently be published to DolphinScheduler.

    DSS-1.1.1 includes all of Project DSS-1.1.1.

    The main features of DSS-1.1.1 are as follows:

    • Support to create workflows without integrated scheduling system.

    Abbreviations:

    • DSS: DataSphereStudio

    Enhancement

    • DSS-993 [DSS-Workflow] Support to create workflow without integrated scheduling system.

    Bugs Fix

    • DSS-954 [DSS-Workspace] Fix the password-free jump exception of Schedulis and Qualitis.
    • DSS-984 [DSS-Workspace] Fix the password-free jump exception of DolphinScheduler.
    • DSS-967 [DSS-Workspace] Fix the problem of abnormality in the interface of getting workspace management user list.
    • DSS-991 [DSS-Appconn] Fix the exception of publishing the Quality node to DolphinScheduler.
    • DSS-997 [DSS-Appconn] Fix workflow editing exception when it has been published to DolphinScheduler.
    • DSS-995 [DSS-Scriptis] Fix the ability to publish scripts as RESTFUL API.

    Credits

    The release of DSS 1.1.1 is inseparable from the contributors of the DSS community. Thanks to all the community contributors, including but not limited to: ichenfeiyang, jacktao007, LMQ0023, utopianet, coombe.


    Compiled Version

    Tencent Cloud:

    DSS-1.1.1&Linkis-1.1.1 Compiled package (.tar.gz)

    Source code(tar.gz)
    Source code(zip)
  • 1.1.0(Jul 1, 2022)

    DataSphereStudio 1.1.0 is a milestone in the practice of the data application development and management framework. It integrates all the open source ecological components of WeDataSphere and brings a series of powerful new features, as well as a more streamlined and easy-to-integrate data application development architecture design and implementation.

    DSS-1.1.0 includes all of Project DSS-1.1.0.

    The main features of DSS-1.1.0 are as follows:

    • All the open source ecological components of WeDataSphere have been integrated, including Apache Linkis 1.1.1, Exchangis 1.0.0, Schedulis 0.7.0, Qualitis 0.9.2, Visualis 1.0.0, Streamis 0.2.0 and Prophecis 0.3.2.
    • Integrate Apache DolphinScheduler 1.3.X: support one-click publishing of DSS workflows as DolphinScheduler workflows, and design and develop a new scheduling center for workflow scheduling.
    • User experience optimization. Such as support for skinning, top navigation bar revision, DSS development center revision, etc.
    • Help manual and beginner's guide.
    • Installation and deployment optimization. It further simplifies the one-click installation and deployment process of DSS&Linkis family buckets, allowing the installation of DSS and Linkis to be completed within half an hour.
    • Support graceful upgrade. Provides a detailed upgrade process on how to upgrade from DSS-1.0.1 to DSS-1.1.0, and how to migrate from DSS-0.9 to DSS-1.1.0.
    • AppConn architecture optimization. The architecture is simpler and clearer, and the documentation is more comprehensive and detailed. It teaches you how to implement a new AppConn and add a new workflow node.

    Abbreviations:

    • DSS: DataSphereStudio
    • DAS: DataApiService

    New Feature

    • DSS-848 [DSS-AppConn] Integrate Apache DolphinScheduler 1.3.X, design and develop a new scheduling center
    • DSS-849 [DSS-Commons] Adapt to Apache Linkis 1.1.1
    • DSS-716 [DSS-AppConn] Integrate Exchangis 1.0.0
    • DSS-846 [DSS-AppConn] Integrate Schedulis 0.7.0
    • DSS-847 [DSS-AppConn] Integrate Qualitis 0.9.2
    • DSS-850 [DSS-AppConn] Integrate Visualis 1.0.0
    • DSS-851 [DSS-AppConn] Integrate Streamis 0.2.0
    • DSS-852 [DSS-AppConn] Integrate Prophecis 0.3.2
    • DSS-853 [DSS-AppConn] AppConn Architecture Optimization
    • DSS-792 [DSS-Deployment] Support hot update AppConn plugin installation without restarting all services
    • DSS-862 [DSS-Deployment] Support graceful upgrade
    • DSS-863 [DSS-Workspace] Added help manual and beginner's guide
    • DSS-590 [DSS-UI] Optimize development center user experience, such as support changing the background and optimizing the display of the top navigation bar, etc.
    • DSS-864 [DSS-Workflow] Add the script file download function of the workflow version

    Enhancement

    • DSS-865[DSS-Workspace] Streamline part of the interface of the workspace, while removing the useless part of the interface
    • DSS-866 [DSS-Workspace] Add a workspace type option when creating a new workspace; add a management console function on the right side of the workspace homepage; add a help button in the lower right corner
    • DSS-753 [DSS-Workspace] Remove the application store and change the component access entry to the upper left menu bar
    • DSS-728 [DSS-Workspace] Optimize the demo of the application development process at the bottom of the homepage, remove the button of the demo case and change it to stay tuned
    • DSS-590 [DSS-UI] Adjust the style of the explorer button, add the background switching function, change the UI of the project list and navigation bar, change the UI style of the application store
    • DSS-628 [DSS-DAS] The download function is limited to 5000, and every download operation will pop up an operation risk prompt to the user
    • DSS-780 [DSS-DAS] Optimize the result set visualization page display, optimize result set paging and sorting, support pull-down of the result set and log panels, support result set table width extension
    • DSS-864 [DSS-Workflow] Add workflow basic property display
    • DSS-724 [DSS-Workflow] Automatically check the workflow mode when creating a new workflow
    • DSS-725 [DSS-Workflow] Add the workflow importance level feature when creating a new workflow
    • DSS-161 [DSS-Workflow] Support the use of runtime variables in datachecker
    • DSS-868 [DSS-Scriptis] Support dragging the code tab to any position
    • DSS-869 [DSS-Scriptis] Support mutual modification of Sql and Hql file types
    • DSS-870 [DSS-Scriptis] Support copy and paste workspace code files
    • DSS-720 [DSS-Scriptis] Support only viewing tables created by yourself
    • DSS-871 [DSS-Commons] Help users clear cache after login
    • DSS-872 [DSS-AppConn] Supplement the Update and Delete operations of the Schedulis AppConn, and add the corresponding interfaces

    Bugs Fix

    • DSS-650 [DSS-Workspace] Fixed an issue where all 3rd party system items could not be deleted synchronously when deleting an item
    • DSS-594 [DSS-Workspace] Fix the problem of 400 bad request reported in workspace management-user management, editing user interface
    • DSS-873 [DSS-Workspace] Fixed the problem that the user data displayed abnormally on the user management page
    • DSS-601 [DSS-Workspace] Fixed the issue that the drop-down boxes of publish permission, edit permission and view permission on the Create Project page could not obtain all users of the workspace
    • DSS-632 [DSS-Workspace] Fixed the problem that the file directory displayed in the front-end workspace was incorrect after switching the proxy user
    • DSS-874 [DSS-Workspace] Fixed the problem that the interface did not display the visual interface after clicking the visual button
    • DSS-621 [DSS-Workspace] Fix the problem that the script selects right-click to open to the side console and reports an error
    • DSS-875 [DSS-Workspace] Fix the problem of copying and pasting the script to the first opened folder and reporting an error
    • DSS-744 [DSS-Workspace] Fix the problem that other users have abnormal permissions after a user grants permissions to other users after creating a new project
    • DSS-704 [DSS-Workspace] Fix the problem that the project copy function button is not displayed
    • DSS-667 [DSS-Workspace] Fix the problem that the request to the third-party application to check whether the project name is duplicated fails when creating a project
    • DSS-652 [DSS-Workflow] Fix the problem that the eventsender and eventreceiver nodes fail to pass parameters
    • DSS-640 [DSS-Workflow] Fixed the issue that new built-in parameters such as run_year in datachecker did not take effect
    • DSS-759 [DSS-Workflow] Fix the problem that the exported data is disordered or lost when there is a subflow
    • DSS-695 [DSS-Workflow] Fix the issue that sendmail node fails to run
    • DSS-692 [DSS-Workflow] Fix the failure of workflow rollback in the development center
    • DSS-600 [DSS-Workflow] Fix the Failed to async get EngineNode AMErrorException when datachecker, eventsender, eventreceiver nodes are running
    • DSS-605 [DSS-Workflow] Fix the exception that the workflow pause execution interface displays 404
    • DSS-876 [DSS-Workflow] Fixed the issue that the directory did not display the latest content when right-clicking a workflow node's associated script
    • DSS-780 [DSS-Workflow] Fix the problem that the result set of the management console is not displayed completely when running a node with a result set
    • DSS-247 [DSS-Workflow] Fix the problem that datachecker fails to run after eventreceiver
    • DSS-877 [DSS-Workflow] Fix the issue where a workflow got stuck during real-time execution when some nodes had succeeded but their status was not updated
    • DSS-632 [DSS-Workflow] Fix the workflow execution failure when the proxy user is set to a non-system username during real-time execution
    • DSS-607 [DSS-Scriptis] Fix the problem that the workspace fails to import files to HDFS
    • DSS-665 [DSS-Scriptis] Fix the problem of invalid reference to global variables in IDE
    • DSS-635 [DSS-DAS] Fix data service execution failure exception
    • DSS-636 [DSS-DAS] Fix result set download error exception
    • DSS-813 [DSS-DAS] Fixed the issue that when multiple result sets were selected, the page turning of the previous result set would be brought over
    • DSS-878 [DSS-DAS] Fix the problem that the result set in CSV format is displayed without permission
    • DSS-801 [DSS-DAS] Fix the problem that the content of the running log page is incomplete after exporting the result set
    • DSS-620 [DSS-DAS] Fix the exception of visual error when running scripts with result sets
    • DSS-157 [DSS-DAS] Fix result set sorting issue
    • DSS-879 [DSS-Commons] Fix the inconsistency between calling sqoop and script code in shell
    • DSS-880 [DSS-Commons] Fixed the issue that inputting the vi a.txt command on the shell node would cause the script to keep running and the engine is busy
    • DSS-630 [DSS-Commons] Fix the error in the /api/rest_j/v1/dss/datapipe/backgroundservice interface when the proxy user is set to empty
    • DSS-625 [DSS-Commons] Fixed the issue that a project created by the hadoop user, without granting user A any permissions, was still displayed in user A's workspace
    • DSS-791 [DSS-AppConn] Fixed occasional zip failure when AppConn was started
    • DSS-548 [DSS-Engine] Fix the problem that the running node is displayed as Python 2 after the Python 3 engine is killed

    Credits

    The release of DSS 1.1.0 is inseparable from the contributors of the DSS community. Thanks to all the community contributors, including but not limited to: rootljw, teenwolf0910, njnu-seafish, luban08, HanTang1, det101, KidUncle, mingfengwang.


    Compiled Version

    Tencent Cloud:

    DSS-1.1.0&Linkis-1.1.1 Compiled package (.tar.gz)

    Source code(tar.gz)
    Source code(zip)
  • 1.0.1(Feb 24, 2022)

    DSS-1.0.1 includes all of Project DSS-1.0.1.

    DSS-1.0.1 mainly contains three improvements and enhancements:

    • Adapt to Apache Linkis 1.0.3.
    • Deprecate Jersey and use Spring MVC to build HTTP RESTful APIs.
    • Optimize log printing of one click installation deployment script.

    Abbreviations: DSS: DataSphereStudio


    New Feature

    • DSS-444 [DSS-Eco] Adapt to Apache Linkis 1.0.3.
    • DSS-445 [DSS-Architecture] Deprecate Jersey and use Spring MVC to build HTTP RESTful APIs.

    Enhancement

    • DSS-448 [DSS-Package] Optimize log printing of one click installation deployment script.
    • DSS-521 [DSS-Document] Update README about the compatibility information of DSS with integrated third-party application tools.

    Bugs Fix

    • DSS-499 [DSS-Execution] Modify the reuse configuration parameters of the appconn engine to solve the problem that the appconn engine cannot be reused.
    • DSS-516 [DSS-Execution] Add the kill interface to solve the problem that a workflow could not be successfully killed.
    • DSS-470 [DSS-Apiservice] Modify the log printing class referenced by the apiservice module to solve the module compilation error.
    • DSS-448 [DSS-Package] Update the DDL of the dss application component to solve the problem of incorrect display of component list data.
    • DSS-448 [DSS-AppConn] Modify the parameters of the AppConnEngineConnExecutor interface to solve the exception that occurs when the workflow node is deleted.
    • DSS-448 [DSS-Standard] Optimize the judgment logic of SSO reuse login status to solve the problem that the login cannot be redirected normally.
    • DSS-476 [DSS-Workflow] Modify the format of the original interface request parameters to solve the problem of error reporting in workspace creation.
    • DSS-521 [DSS-Orchestrator] Resolve the compilation problem about Pair class.

    Credits

    The release of DSS 1.0.1 is inseparable from the contributors of the DSS community. Thanks to all the community contributors!


    Compiled Version

    1. Tencent Cloud:

    DSS-1.0.1&Linkis-1.1.1 Compiled package (.tar.gz)

    Source code(tar.gz)
    Source code(zip)
  • 1.0.0(Sep 6, 2021)

    DSS-1.0.0 includes all of Project DSS-1.0.0.

    DataSphereStudio 1.0.0 is a major release marking the start of the data application development and management framework, which brings a variety of powerful new features as well as a solid architecture design and implementation for data applications.

    The following key features are added:

    • Add the dss-standard module, which defines three kinds of basic request protocols for integrating with upper-layer application systems.
    • Provides the ability to obtain and manage orchestrator instances.
    • Add workflow conversion standard module, provide the capability of converting a DSS workflow to the tasks of third-party application tools, such as the workflow of Schedulis.
    • Define AppConn core interface and default implementation class.
    • Provide the CRUD interface of the project in the framework module.
    • Provide the CRUD interface of the workspace in the framework module.
    • Supports data API service
    • Enhance the capability of DSS workflow execution server, adapt to the new architecture of DSS1.0.0
    • Optimize the packaged deployment module of DSS1.0.0
    • Compatible with linkis-1.0.2 and above
    • Add a new front-end interface of DSS1.0.0

    Abbreviations:
    DSS: DataSphereStudio


    New Feature

    Framework

    • DSS-364 [DSS-Framework] Provide common classes of framework modules.
    • DSS-365 [DSS-Framework] Provide CRUD interface for orchestration mode.
    • DSS-366 [DSS-Framework] Provide the CRUD interface of the project in the framework module.
    • DSS-367 [DSS-Framework] Provide the CRUD interface of the workspace in the framework module.
    • DSS-368 [DSS-Framework] Enhance the ability of the framework and provide the function of orchestration mode release.
    • DSS-369 [DSS-Framework] Define some public classes of the orchestration module.

    Orchestrator

    • DSS-352 [DSS-Orchestrator] Define some common classes for the orchestration module.
    • DSS-353 [DSS-Orchestrator] Define relevant interface specifications for orchestrator mode conversion.
    • DSS-354 [DSS-Orchestrator] Defines the basic class of the orchestration module.
    • DSS-355 [DSS-Orchestrator] Provides the orchestrator module with the ability to access database persistence.
    • DSS-356 [DSS-Orchestrator] Provides the ability to obtain and manage orchestrator instances.

    Workflow

    • DSS-349 [DSS-Workflow] Provide DSS workflow commons module.
    • DSS-350 [DSS-Workflow] Add workflow conversion standard module.
    • DSS-361 [DSS-Workflow] Provide DSS workflow sdk module
    • DSS-362 [DSS-Workflow] Enhance the capability of DSS workflow server, adapt to the new architecture of DSS1.0.0

    Standard

    • DSS-370 [DSS-Standard] Defines the third-level development standard of AppConn
    • DSS-371 [DSS-Standard] Provide DSS standard commons module
    • DSS-372 [DSS-Standard] Defines the second-level development standard of AppConn
    • DSS-373 [DSS-Standard] Defines the first-level development standard of AppConn
    • DSS-384 [DSS-Standard] Supports the integration of third-party applications using DSS standard

    AppConn

    • DSS-351 [DSS-AppConn] Realize the application of the three major access specifications of DSS1.0.0
    • DSS-357 [DSS-AppConn] Define AppConn core interface and default implementation class.
    • DSS-358 [DSS-AppConn] Implement DSS1.0.0 AppConn class loading and instantiation module.
    • DSS-359 [DSS-AppConn] Define the abstract dispatch AppConn.
    • DSS-360 [DSS-AppConn] Implement Linkis AppConn plugin.
    • DSS-363 [DSS-AppConn] Provide the ability of AppConn and AppInstance to persist the database.
    • DSS-377 [DSS-AppConn] Provide some basic function modules of appconn in DSS1.0.0

    ApiService

    • DSS-395 [DSS-ApiService] Supports data API service

    Web

    • DSS-376 [DSS-Web] Add a new front-end interface of DSS1.0.0

    Enhancement

    • DSS-347 [DSS-Workflow] Enhance the capability of DSS workflow execution server.
    • DSS-348 [DSS-Workflow] Enhance the capability of DSS workflow node execution module.
    • DSS-362 [DSS-Workflow] Enhance the capability of DSS workflow server, adapt to the new architecture of DSS1.0.0
    • DSS-380 [DSS-DataPipe] The data import and export service module provides an interface for data import and export.
    • DSS-381 [DSS-DataPipe] Provides the module for data import and export
    • DSS-388 [DSS-Package] Optimize the packaged deployment module of DSS1.0.0
    • DSS-390 [DSS-Config] Update the configuration file and database script file

    Bugs Fix

    • DSS-385 [DSS-Package] Modify the packaged deployment module of DSS1.0.0
    • DSS-389 [DSS-Package] Update the configuration file and database script file of DSS1.0.0
    • DSS-393 [DSS-ContextService] Add an interface to get ContextID
    • DSS-396 [DSS-Package] Remove some useless code and solve compilation problems
    • DSS-399 [DSS-Package] Optimize application startup script

    Credits

    The release of DSS 1.0.0 is inseparable from the contributors of the DSS community. Thanks to all the community contributors!


    Compiled Version

    1. Tencent Cloud:

    DSS-1.0.0 Compiled (.tar.gz)

    Source code(tar.gz)
    Source code(zip)
  • 0.9.1(Apr 2, 2021)

    DSS 0.9.1 is a version led by ChinaTelecom Ctyun Big Data Platform Team with the help of WeBank.

    This is the next release based on DSS 0.9.0. This version aims to reduce the operation and maintenance costs of WeDataSphere components such as DSS, Linkis, and Schedulis for community users through the new "new user initialization" feature, so that DSS can automatically initialize the execution environment of newly created users.

    Since 0.9.0, this version contains 7 improvements and enhancements.

    Enhancement

    • [DataSphereStudio-274] Add an account creation function. Users can customize the configuration according to their own environment; currently supported creation targets include LDAP, management node accounts, HDFS & Linux directories, scheduling accounts, Hive databases, and keytabs. Because community users' environments vary widely, users can also add new implementations or modify the scripts to adapt to their own environment.

    • [DataSphereStudio-288] Support creating users on the master machine.

    • [DataSphereStudio-289] Support creating workspaces for newly added users (supports Linux and HDFS).

    • [DataSphereStudio-290] Support adding users to the Schedulis configuration and immediately reloading it.

    • [DataSphereStudio-291] Support initializing the environment needed by Hive for newly added users.

    • [DataSphereStudio-292] Support creating and distributing keytab files in Kerberos clusters for new users.

    • [DataSphereStudio-293] Support adding users to the LDAP-based system.

    Credits

    The release of DSS 0.9.1 is inseparable from the contributors of the WeDataSphere community. They selflessly contributed their code and actively communicated with community partners. Without their help, DSS 0.9.1 could not have been successfully released. Thanks to all the community contributors! Rank in no particular order:

    • luxl: community lead of this release; basic code and workspace module, scheduling authorization, and part of the front-end development of account creation.
    • HanTang: LDAP module and Hadoop master node account creation.
    • lvjw: documentation contribution.
    • schumiyi: fixed the problem that applications prompted "not open source" after the initial installation.
    • tomshy1: front-end development of account creation.
    • JsonLiuUp: keytab creation and distribution, Hive database activation and authorization.
    • ldtong: scheduling authorization.

    Upgrade Guide

    This version involves adjustments to some JARs. If DSS is running in your production environment, you only need to replace a few DSS JARs to upgrade. For details, please refer to: DSS 0.9.1 Upgrade Guide. Please note: this version provides a new feature called "new user initialization". In order to use this feature properly, please refer to the DSS 0.9.1 New User Initialization Usage Document.

    Cloud Resource

    DSS + Linkis + Qualitis + Visualis + Azkaban one-click installation package URL:dss_linkis.zip



    Source code(tar.gz)
    Source code(zip)
  • 0.9.0(Jul 13, 2020)

    0.9.0 is an important version led by the ChinaTelecom Ctyun Big Data Platform Team with the help of WeBank. This is the next release based on the DataSphere Studio 0.8.0 line. This version aims to build a one-stop big data application development and management platform by introducing the concepts of "DSS Integration Standard" and "Workspace", which greatly enhances the user's big data development experience.

    DSS Integration Standard: a unified application-system access specification composed of SecurityService and the AppJoint specification of DSS, so that enterprises can connect related big data products to DSS very easily and gain one-stop portal display and management capabilities. (The DSS application access specifications will continue to be supplemented and improved in subsequent versions.)

    Workspace specification: enables different types of data applications (such as offline workflow applications, real-time applications, data API services, etc.) to be organized and managed from a unified perspective, enhancing application organization and management capabilities.

    This release contains 23 bug fixes, improvements and enhancements.

    Enhancement

    • [DSS-149] DSS homepage optimization, mainly including entries for operations and for workspace creation and management, as well as cases and introductions.
    • [DSS-150] DSS workspace homepage optimization. After the administrator creates a space and assigns it to the corresponding role (for example, developer), this page becomes the first page after login. It includes entries for common applications, administrator functions, and the various types of applications.
    • [DSS-151] Common-function configuration for the workspace: workspace roles can freely configure common function entries, which can be added and removed.
    • [DSS-152] Optimized the DSS front-end debugging configuration. With this configuration, the proxied back-end gateway address can be switched between environments (such as production and testing) as needed, to quickly locate problems.

    Bug Fix

    • [DSS-120] Added use cases such as HQL, PySpark, and Spark SQL for new users to test and try out.
    • [DSS-176] Fixed the footer channel not working inside an iframe.
    • [DSS-185] Fixed 0.8.0-scriptis not showing the execution result of Python scripts.
    • [DSS-193] Refactored legacy code to replace DWS with DSS.

    Credits

    The release of DataSphere Studio 0.9.0 would not have been possible without the contributors of the WeDataSphere community, who selflessly contributed their code and actively engaged in technical exchanges with community partners. Thank you to all community contributors! In no particular order: det101: contributed front-end code for the workspace homepage, optimized the front-end local debugging configuration, and fixed the bug where the footer channel did not work inside an iframe. yuchenyao: contributed front-end code for the DSS homepage and optimized the project and workflow creation process. schumiyi: contributed back-end code for the DSS homepage and workspace homepage. ryanqin01: contributed database-related code. AdamWang: contributed back-end code for the workspace homepage and refactored some remaining legacy issues.

    Upgrade Guide

    Because DSS 0.9.0 introduces the concept of the user workspace (Workspace), if you are upgrading from DSS 0.7 or DSS 0.8 to DSS 0.9.0, you need to make some adjustments to the database tables after completing the platform deployment. Please see: DSS 0.9.0 Upgrade Guide

    Cloud Resource

    1. DSS one-click installation package URL: https://pan.baidu.com/s/1unf0FqO6GrvUzYjnHOBWug Password: nnx9
    2. DSS + Linkis + Qualitis + Visualis + Azkaban one-click installation package URL: https://osp-1257653870.cos.ap-guangzhou.myqcloud.com/WeDatasphere/DataSphereStudio/0.9.0/dss_linkis.zip
    3. dss-web one-click installation package URL: https://pan.baidu.com/s/1nS4iVildF4oXFdd1CPh3AA Password: 56nh
    4. linkis-jobType(Azkaban) URL: https://pan.baidu.com/s/1hkCaVFoIWJH5atedSqhd8Q Password: 09pz

    Source code(tar.gz)
    Source code(zip)
  • 0.8.0(Jun 28, 2020)

    Enhancement

    • [DSS-158] Database -> Table -> query table: the generated temporary file name was changed from dbName.tableName.hql to dbName_tableName.hql.
    • [DSS-168] Updated DataSphereStudio to Linkis version 0.9.4 and fixed some compilation errors.

    Bug Fix

    • [DSS-107] Fixed the installation failing when the environment has multiple IP addresses.
    • [DSS-116] Fixed repeated signal consumption by the eventchecker plugin in the appjoint/scheduler module in extreme scenarios.
    • [DSS-157] Fixed a result-set sorting error where only the current column was sorted in ascending or descending order.
    • [DSS-159] Fixed the wrong default value of the eventcheck node parameter msg.body.
    • [DSS-160] Fixed the datacheck node failing to check a table correctly when the partition format was filled in incorrectly.
    • [DSS-161] Fixed the run_date variable not being displayed correctly.
    • [DSS-162] Fixed the incorrect execution-time display during real-time workflow execution.
    • [DSS-165] Fixed the bug where a workflow script had finished executing but its displayed execution time kept increasing.
    • [DSS-167] Fixed the invalid use of 'utf-8' as the Content-Encoding header value.

    Credits

    The release of DataSphere Studio 0.8.0 would not have been possible without the contributors of the WeDataSphere community, who selflessly contributed their code and actively engaged in technical exchanges with community partners. Thank you to all community contributors! In no particular order: 5herhom: fixed the bug where multiple IP addresses on the server caused installation and startup to fail. xing.huang: contributed the fix for the invalid use of 'utf-8' as the Content-Encoding value. SelfImpr001: contributed the optimized message-consumption code for the eventcheck node.



    Cloud Resources

    We provide a DSS + Linkis + Qualitis + Visualis + Azkaban one-click deployment package (the full suite). Since the package is large (1.7 GB) and downloads slowly from GitHub, please obtain it via:

    1. Baidu Cloud:

    URL: https://pan.baidu.com/s/1Jhq1vQB_gkYfbU-LMaUJJA

    Password: 3hvj

    DSS back-end installation package:

    2. Baidu Cloud:

    URL: https://pan.baidu.com/s/1cbmd4rcjCYh4agZ26lsr2A

    Password: 3l1r

    DSS front-end installation package:

    3. Tencent Cloud:

    URL: https://share.weiyun.com/dJjSOAzE

    Password: ur580z

    DSS & Linkis one-click deployment script

    Tencent Cloud:

    URL: https://share.weiyun.com/58yxh3n

    Source code(tar.gz)
    Source code(zip)
  • 0.7.0(Feb 12, 2020)

    Enhancement

    • [DSS-25] Workflow supports the Python node type.
    • [DSS-97] Workflow supports the Shell node type.
    • [DSS-73][DSS-78] Workflow supports the JDBC node type.
    • [DSS-90] (Epic enhancement) One-click deployment for DSS + Linkis + Qualitis + Visualis + Azkaban.
    • [DSS-37] Added environment checks and service-existence checks to the DSS installation, and simplified the installation steps of the lite version.
    • [DSS-79] Optimized the one-click deployment script and added common variables, so that standalone DSS deployment and full one-click deployment are both unaffected.
    • [DSS-88] The deployment user is added by default to the dss-server authentication configuration token.properties.
    • [DSS-94] DSS Web installation optimization.
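    The token.properties file referenced in [DSS-88] is a plain Java properties file of key=value pairs. A sketch of what an automatically added entry might look like; the user name and token value below are illustrative, not what the installer actually writes:

```properties
# Hypothetical example entry: the deployment user mapped to its access token.
hadoop=hadoop-token
```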

    Bug Fix

    • [DSS-26] Fixed a NoSuchElementException thrown when releasing a project after a Sendemail node had been deleted.
    • [DSS-56] Fixed the failure to delete a visualis node when the node content was empty.
    • [DSS-80] Fixed an exception that caused DSS to fail to start after Linkis-common removed the httpclient dependency.
    • [DSS-81] Fixed the wrong JDBC node type definition in the database.
    • [DSS-82] Fixed an error where EventReceiver nodes shared custom variables, and an exception where the number of status-polling threads grew without bound.
    • [DSS-84] Fixed the status check failing when visualis-server is started at a remote address, because the IP address was not converted to the actual remote IP address.
    • [DSS-86] Fixed the error of using "hostname -i" to get the local host IP on CentOS 8.
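    The [DSS-86] fix concerns how an install script discovers the local IP. A sketch of a more robust approach than `hostname -i`, which can print multiple addresses (or fail) on systems such as CentOS 8; this illustrates the general technique, not DSS's actual installation script:

```shell
# `hostname -I` lists all assigned addresses; keep only the first one.
local_ip=$(hostname -I 2>/dev/null | awk '{print $1}')

# Fallback: ask the kernel routing table which source address would be used.
if [ -z "$local_ip" ]; then
    local_ip=$(ip route get 1 2>/dev/null | awk '{for (i = 1; i <= NF; i++) if ($i == "src") {print $(i+1); exit}}')
fi

echo "$local_ip"
```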


    Cloud Resources

    We provide a DSS + Linkis + Qualitis + Visualis + Azkaban one-click deployment package (the full suite). Since the package is large (1.3 GB) and downloads slowly from GitHub, please obtain it via:

    1. Baidu Cloud:

    URL: https://pan.baidu.com/s/1hmxuJtyY72D5X_dZoQIE5g

    Password: p82h

    2. Tencent Cloud:

    URL: https://share.weiyun.com/5vpLr9t

    Password: upqgib

    DSS installation package:

    1. Tencent Cloud:

    URL: https://share.weiyun.com/5n2GD0h

    Password: p8f4ug

    2. Baidu Cloud:

    URL: https://pan.baidu.com/s/18H8P75Y-cSEsW-doVRyAJQ

    Password: pnnj

    DSS & Linkis one-click deployment script

    Tencent Cloud:

    URL: https://share.weiyun.com/58yxh3n

    Source code(tar.gz)
    Source code(zip)
    azkaban-linkis-jobtype-0.7.0.zip(87.96 MB)
    wedatasphere-dss-0.7.0-dist.tar.gz(547.85 MB)
    wedatasphere-dss-web-0.7.0-dist.zip(20.99 MB)
  • 0.6.0(Dec 6, 2019)

    Enhancement

    • [DSS-4] Azkaban AppJoint optimized from mandatory to optional.
    • [DSS-5] Optimized the error message shown after failing to log in to Azkaban from the DSS side.
    • [DSS-7] If the submission user of Azkaban is empty, it is recommended to change the submission user to a proxy user.
    • [DSS-13] One-click deployment optimization.

    Bug Fix

    • [DSS-10] Optimized the judgment logic of the Qualitis AppJoint canExecute method.


    Baidu Cloud Resources

    URL: https://pan.baidu.com/s/1srn-V7WJDLy2QAPs5qGyHA

    Password: iyya

    Source code(tar.gz)
    Source code(zip)
    azkaban-linkis-jobtype-0.6.0.zip(88.43 MB)
    wedatasphere-dss-0.6.0-dist.tar.gz(494.59 MB)
    wedatasphere-dss-web-0.6.0-dist.zip(20.99 MB)
  • 0.5.0(Nov 30, 2019)

    Core features

    1. One-stop, full-process data application development management UI
    2. Project, as the management unit, organizes and manages the business applications of each data application system
    3. AppJoint, based on Linkis, defines a unique design concept
    4. Various integrated data application components

       Welcome to download and use DSS!



    Source code(tar.gz)
    Source code(zip)
    linkis-jobtype-0.5.0.zip(88.45 MB)
    wedatasphere-dss-0.5.0-dist.tar.gz(459.96 MB)
    wedatasphere-dss-web-0.5.0-dist.zip(26.02 MB)