In April 2018, the General Office of the State Council issued the Guidelines on Promoting the Development of "Internet+ Healthcare", calling for the coordinated advancement of a unified, authoritative, and interconnected national health information platform. A key goal is to enable seamless data sharing across departments, regions, and sectors. "Interconnectivity" has since become a central theme in the healthcare industry, but behind it lies the long-standing challenge of "information silos" and the lack of data sharing.

For instance, HIS, LIS, PACS, and HRP systems coexist with both relational and non-relational databases, and a wide variety of structured and unstructured data needs to be integrated.
Hospital data consists mainly of electronic medical records and imaging data. Imaging data in particular is huge in volume and growing rapidly, placing extremely high demands on the stability and real-time performance of data synchronization.
Various information systems employ different technologies and standards, making the integration of data structures and the consistent processing of disparate data the fundamental bottleneck in the construction of a data fusion platform.
DataPipeline provides a wide range of data source connectors, enabling unified access to heterogeneous data sources both inside and outside the medical institution, as well as automatic data exchange at the destination.
By leveraging SQL Server's Change Tracking mechanism, changed data is captured and synchronized from the business database to the Oracle data warehouse in real time, with second-level latency.
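To illustrate the mechanism, the sketch below shows the T-SQL that a Change Tracking poller issues against SQL Server. This is a minimal illustration, not DataPipeline's actual implementation; the table and column names (`dbo.Patients`, `PatientID`) are hypothetical, and a real pipeline would execute these statements through a driver such as pyodbc and then apply the changes to Oracle.

```python
# Minimal sketch of polling SQL Server Change Tracking.
# CHANGE_TRACKING_CURRENT_VERSION() and CHANGETABLE(CHANGES ...) are
# built-in SQL Server constructs; the table/key names are illustrative.

def current_version_query():
    """T-SQL that reads the database's current change-tracking version."""
    return "SELECT CHANGE_TRACKING_CURRENT_VERSION();"

def changes_query(table, key_column, last_version):
    """T-SQL listing rows changed since last_version.

    SYS_CHANGE_OPERATION is 'I', 'U', or 'D' (insert/update/delete);
    the key column tells the pipeline which row to re-read or delete
    on the destination side.
    """
    return (
        f"SELECT CT.SYS_CHANGE_OPERATION, CT.{key_column} "
        f"FROM CHANGETABLE(CHANGES {table}, {last_version}) AS CT;"
    )

# A sync loop would: (1) read the current version, (2) fetch the changes
# since the last synced version, (3) apply inserts/updates/deletes to the
# Oracle warehouse, and (4) persist the new version as the next checkpoint.
```

Because each poll returns only the net changes since the saved version, the loop stays cheap even on large tables, which is what makes second-level synchronization practical.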
DataPipeline breaks with the traditional perception of data fusion platforms as opaque and unobservable. It adopts a visual data fusion management interface, so that even staff without a big data background can configure a data pipeline independently within a short time.

For medical institutions, real-time data access is vital in emergency situations. DataPipeline tunes data transmission across different database types to ensure transfer efficiency and achieve sub-second latency.

For systems that cannot connect to the ESB, DataPipeline synchronizes data from HIS, LIS, and other information systems to the data center in real time for centralized processing and management, ensuring data connectivity and sharing.

Traditional data fusion tools place high demands on operators and require hundreds of thousands of man-hours of investment each year. DataPipeline's visual data fusion management interface significantly reduces operation and maintenance costs.