Take a peek at the projects we've worked on so far. Here you can find detailed information about several projects, describing the objectives, the challenges, and the results.
Developing a new multi-platform that integrates the needs of every institution across the country proved essential to the government’s digital transformation effort.
As most institutions operated in isolation, each with its own applications and databases, communication and integration between the services of different institutions was nearly impossible. Previously, each citizen had to register with every service separately and maintain different credentials for each, with their data stored independently by each service.
The multi-platform, designed to serve all citizens' needs in one integrated system, required a centralized data solution from which to fetch relevant information.
Because the independent databases had different structures and schemas, a solution was needed to gather information from all of them and restructure it, so that the new centralized data lake would contain relevant, valuable information for the new multi-platform.
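To make the idea concrete, here is a minimal sketch of that consolidation step in plain Python. The institutions, schemas, and field names (`taxpayer_no`, `patient_id`, and so on) are hypothetical, invented for illustration; the real project worked with the actual institutional databases.

```python
# Hypothetical schemas: two institutions store citizen data differently.
# Each normalizer maps a source-specific record to one common structure
# so the centralized data lake holds uniformly shaped records.

def normalize_tax_record(record):
    """Map the (hypothetical) tax office schema to the common schema."""
    return {
        "citizen_id": record["taxpayer_no"],
        "full_name": record["name"],
        "source": "tax_office",
    }

def normalize_health_record(record):
    """Map the (hypothetical) health service schema to the common schema."""
    return {
        "citizen_id": record["patient_id"],
        "full_name": f'{record["first_name"]} {record["last_name"]}',
        "source": "health_service",
    }

def consolidate(tax_records, health_records):
    """Gather records from both sources into one uniformly structured list."""
    lake = [normalize_tax_record(r) for r in tax_records]
    lake += [normalize_health_record(r) for r in health_records]
    return lake
```

Once every record shares the same shape and a common `citizen_id`, services can join data across institutions instead of keeping isolated copies.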
1) Azure Data Factory
In the world of big data, raw, unorganized data is often stored across relational, non-relational, and other storage systems. On its own, however, raw data lacks the context and meaning needed to provide actionable insights to analysts, data scientists, or business decision makers.
The data was structured differently in each source system. A solution was needed that could extract the data, integrate it centrally, and orchestrate and operationalize the processes that refine these enormous stores of raw data into actionable business insights.
Azure Data Factory provided a data integration and transformation layer that worked across the organization’s digital transformation initiatives by executing complex hybrid extract-transform-load (ETL), extract-load-transform (ELT), and data integration projects.
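The following is an illustrative sketch of the extract-transform-load sequencing described above, written in plain Python rather than against the Data Factory API. The source names and fields are invented for the example; a real Data Factory pipeline expresses the same stages as linked services, datasets, and activities, and runs them at scale.

```python
# Illustrative ETL sketch (plain Python, not the Data Factory API):
# extract rows from isolated sources, transform them to a common shape,
# and load them into a centralized store.

def extract(sources):
    """Pull raw rows from every source system (here: in-memory lists)."""
    for source_name, rows in sources.items():
        for row in rows:
            yield source_name, row

def transform(source_name, row):
    """Normalize keys to lowercase and tag each row with its origin."""
    clean = {key.lower(): value for key, value in row.items()}
    clean["source"] = source_name
    return clean

def load(rows, lake):
    """Append transformed rows to the centralized data lake (a list)."""
    lake.extend(rows)
    return lake

def run_pipeline(sources):
    """Orchestrate the three stages in order, as a pipeline run would."""
    transformed = (transform(name, row) for name, row in extract(sources))
    return load(transformed, [])
```

In an ELT variant, the `load` step would run before `transform`, landing raw data in the lake first and refining it in place; Data Factory supports both orderings.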
With the data now centralized and organized, the multi-platform could be developed, integrating the needs of every institution across the country and proving essential for delivering centralized services.
Institutions can now pull data from the centralized data lake and obtain valuable, real-time information about each registered citizen, delivering public services quickly and securely.
For the ordinary user, the citizen, the new platform provides a single account for all institutions, making every interaction with public services seamless. Through the multi-platform, actions such as paying taxes or requesting official documents take just a few clicks.
Azure Data Factory proved to be a simple, fast, and secure way to migrate, structure, and manipulate data. The organization could choose how the process works and which data is selected and stored in the new centralized platform, and could also derive new, valuable data and insights. With Azure Data Lake Storage Gen2, storing these massive amounts of data was seamless, with the added ability to use a hierarchical namespace.
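A brief sketch of why the hierarchical namespace matters, modeled in plain Python rather than the Data Lake Storage Gen2 API. In a flat object store, "folders" are only name prefixes, so reorganizing a folder means rewriting every object key; a hierarchical namespace treats directories as real objects, so the same reorganization is a single move.

```python
# Flat vs. hierarchical namespace, modeled with Python dicts
# (illustrative only, not the Data Lake Storage Gen2 API).

def rename_flat(store, old_prefix, new_prefix):
    """Flat namespace: rewrite every key that starts with the old prefix."""
    return {
        (new_prefix + key[len(old_prefix):]
         if key.startswith(old_prefix) else key): value
        for key, value in store.items()
    }

def rename_hierarchical(tree, old_dir, new_dir):
    """Hierarchical namespace: move one subtree in a single operation."""
    tree[new_dir] = tree.pop(old_dir)
    return tree
```

For a data lake holding millions of objects per institution, this difference makes directory-level operations (renames, ACLs, atomic moves between raw and curated zones) practical.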