New Data Capabilities - A New Paradigm
Moving to Azure gives the term Data Platform a completely new paradigm. In the old paradigm it was often synonymous with a SQL Server, which included storage, compute and the data integration tools in the same "box". The outcome was often "just" a Data Warehouse with structured data, used for reporting and little else.
With cloud computing you get, in short:
- Flexible and scalable storage possibilities
- Separate compute power which can be scaled up or down
- Many data handling tools at your disposal depending on the data volumes and complexity
- Output that is much more flexible than before and can be used to enable operational processes, external APIs, real-time analytics scenarios, etc.
Data platform in Azure
Depending on the business requirement, the storage can have different "temperatures" (hot, cool or archive), the data structure can vary from very structured (with full governance) through semi-structured to completely unstructured, and the data latency can range from batch via near-real-time to actual real-time.
Once you have your data platform in Azure you have many different data handling tools at your disposal: from a classic ETL (Extract, Transform and Load) tool like Azure Data Factory, to AI with Azure Machine Learning, to real-time analytics with IoT Hub, and on to Power BI as a self-service and/or visualization front end. You can also use Azure Synapse Analytics, which can be seen as a complete suite of data services covering everything you need to establish what Microsoft calls a Data Estate.
All in all, this means that the Azure-enabled Data Platform is able to support the business and its changing requirements in a much more flexible and agile way. Once you have your Data Platform in Azure you can establish different data sets, test different scenarios and run PoCs quickly; you can "fail fast", learn and continue with limited resources or calendar time.
How to realize it – a Data Roadmap
Given the almost endless capabilities and opportunities of an Azure-enabled Data Platform, Innofactor recommends that you as an organization start with what we call a Data Roadmap. It is a quick, high-level process where we contribute our knowledge of and experience with the possibilities (and challenges) of Azure, working closely with you and your knowledge of your business and its relevant challenges and opportunities.
It is a very structured process: we start from your current state, together define an ambitious destination for your data platform, and then draw up a high-level roadmap of how to get there across three perspectives:
- Organization, processes and governance
- Data and systems
- Platform infrastructure
End deliverable - a high-level roadmap
The end deliverable is a high-level roadmap for the coming 2-3 years showing how you can realize the relevant data capabilities to support your business strategy. This does not mean that you will have to migrate all workloads to Azure; it could be that certain areas of your data platform are just fine where they are, and you can wait for a "compelling event" to justify a migration.
In the process it is important that we together find and define the critical business drivers supporting and justifying the journey. At Innofactor we have defined six overall drivers, but you as an organization might have more or different ones:
- Business development and transformation
- Democratization of data
- Business process optimization
- GDPR and compliance
- Modernization of legacy
- Enablement of cloud adoption