Following Microsoft’s recent unveiling of Fabric, a flood of videos has hit the internet, making it hard to keep pace. Even though I was part of the private preview, I wanted to take some time to digest all the information before sharing my first thoughts on Fabric’s release.
While there are already plenty of blogs and tutorials about Fabric, I also noticed claims that Fabric is just a rebranding, or merely a new way to change the licensing.
Throughout this post, I will try to give an honest opinion on what Fabric is and whether it is genuinely a new product or just a rebranding…
What is Fabric
To keep things short and simple, I describe Fabric as an end-to-end Software-as-a-Service (SaaS) solution covering data storage, data integration, data science, real-time analytics, and reporting, all brought together in one shared platform that also provides data governance and security.
So is Fabric really new?
As a platform, Fabric is definitely new and seems like a promising product. However, the tools included in Fabric are not necessarily new: they are individual products brought together from different corners of Azure. Some are unchanged, some have undergone enhancements, and some had already been integrated into Power BI or Synapse. Now let’s take a deeper look at what Fabric really brings to the table and see which products are new, enhanced, or the same as before.
Fabric is like Synapse 3.0
You may wonder why I call Fabric Synapse 3.0. Well, to understand this, we have to look at the first two versions of Synapse.
Azure SQL DWH (Synapse 1.0)
The first version of Synapse, released in 2016, was essentially a massively parallel processing database available as PaaS through Azure. While it integrated well with Azure AD and Azure Data Factory, it had its own limitations.
Just as an example, to move data with ADF from a data lake to Azure SQL DWH, we had to keep several browser tabs open and switch between them, as the tools were not bundled together. Many SQL functions were also unavailable in Azure SQL DWH.
Synapse Analytics (Synapse 2.0)
Synapse Analytics arrived in 2019 as a PaaS platform that bundled several tools together under a workspace called Synapse Studio. In Synapse Studio we could access and use the following tools in a single place (note that some of them were only added more recently):
- Synapse dedicated pool (formerly Azure SQL DWH)
- ADF (also called Synapse pipelines)
- SQL Serverless
- Data Lake Gen2, directly available in Synapse Studio
- Data flow
- Spark notebooks with Spark pools (a bit slow to start, a few minutes each time)
- Delta tables (yes, you heard me: with Spark we could already create and work with Delta)
- Lake databases (supporting CSV, Parquet, and Delta formats)
- Power BI Dataset creation and even PBI report creation
I may have forgotten some features, but as you can see, Synapse was already pretty close to what Fabric has to offer today… or maybe not.
As an owner of a Synapse platform, and more generally of an Azure platform, I can tell you that there is a lot of management overhead to get things done: configuring all the storage account access on the security side, the network side, and the CI/CD side. As for billing, you pay for storage, you pay for bandwidth, some features are billed per hour, serverless is billed per volume of data processed, and so on.
So in the end, it is quite hard to get a global picture of what you pay today and how much you will pay in the future. It is also hard to know who has access to what, unless you buy external tools or put automation scripts in place to audit your tenant.
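To make the billing point concrete, here is a small sketch of what totalling a month of à-la-carte analytics costs looks like when every service bills on a different meter. All prices and usage figures are hypothetical, purely for illustration:

```python
# Hypothetical sketch: totalling one month of à-la-carte analytics costs.
# Every figure below is made up for illustration only -- real Azure pricing
# varies by region, tier, and reservation.

def monthly_cost(usage: dict, prices: dict) -> float:
    """Sum cost per meter: each service bills on a different unit."""
    return sum(usage[meter] * prices[meter] for meter in usage)

prices = {
    "storage_gb": 0.02,            # per GB stored
    "bandwidth_gb": 0.08,          # per GB of egress
    "dedicated_pool_hour": 1.51,   # per hour the pool is running
    "serverless_tb_scanned": 5.0,  # per TB of data processed
}

usage = {
    "storage_gb": 2_000,
    "bandwidth_gb": 150,
    "dedicated_pool_hour": 300,
    "serverless_tb_scanned": 4,
}

total = monthly_cost(usage, prices)
print(f"Estimated month: ${total:,.2f}")
```

Four meters is already the simple case; a real tenant has far more, which is exactly why a single capacity price is easier to reason about.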
Fabric (Synapse 3.0)
So why do I call Fabric Synapse’s third version? Well, to me, Fabric fills the gap of everything that was missing in Synapse.
Here are all the features in Fabric that were not in Synapse or that were not built-in features.
- Fabric is cross-database
- Shortcuts (including other cloud providers such as Amazon S3)
- Fabric has an out-of-the-box feature that converts data source data to Delta
- Serverless engine performance improved
- Power BI alert and Power Automate action + better monitoring experience
- Git integration has been improved (I haven’t tested it)
- One Lake (instead of multiple lakes)
- Use T-SQL on top of Delta files
- Full integration of all the PBI features + better admin control by domains
- Event hub integrated
- Data Science environment integrated (I haven’t tested it)
There may be other features I missed, and some smaller ones I deliberately left out, as I wanted to point out only the major changes included in Fabric. But obviously, the biggest change is that all these features are offered as SaaS and thus require only minimal configuration on the admin side.
One Lake is an enhancement of Azure Data Lake Storage Gen2 and is, in my opinion, the greatest enhancement they have made. So technically it’s not new, since it’s still based on Data Lake Gen2 technology, but the way they designed and simplified it is just amazing: one lake, one single place where you can store, transform, share, and secure your data.
As for the One Lake tool, it is like Azure Storage Explorer but designed for One Lake.
Shortcuts are new for accessing data from external cloud providers; however, they are not new for data residing in Azure Data Lake, as we could already leverage CETAS tables in SQL Serverless without moving any data. To convert data to Delta format, though, we had to transform the data at least once manually, using either a dataflow or a Spark pool.
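For readers who never used it, the CETAS pattern mentioned above looked roughly like the T-SQL below (assembled here as a Python string so it can be inspected; you would run it from Synapse Studio or any T-SQL client). The table name, data source, file format, and paths are all hypothetical. Note that CETAS materializes the query result back to the lake as Parquet; the inner OPENROWSET is the part that reads the files in place:

```python
# Sketch of the CETAS (CREATE EXTERNAL TABLE AS SELECT) pattern that SQL
# Serverless offered before Fabric. All object names, the data source and
# the file format are hypothetical -- adapt them to your own workspace.

cetas_sql = """
CREATE EXTERNAL TABLE dbo.sales_curated
WITH (
    LOCATION = 'curated/sales/',   -- target folder in the data lake
    DATA_SOURCE = my_lake_ds,      -- points at the ADLS Gen2 account
    FILE_FORMAT = parquet_ff       -- result written as Parquet
)
AS
SELECT order_id, amount, order_date
FROM OPENROWSET(
    BULK 'raw/sales/*.csv',
    DATA_SOURCE = 'my_lake_ds',
    FORMAT = 'CSV',
    PARSER_VERSION = '2.0',
    HEADER_ROW = TRUE
) AS raw_sales;
"""

print(cetas_sql)
```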
As seen before, the Delta format is not new and was already supported by both the SQL Serverless pool and the Spark pool. However, there is a massive change here: Fabric automatically converts the data that you load into Delta format, and this is a huge improvement. Before, the only way to work with Delta tables was to write some code using Spark, or to use a dataflow pipeline, and you had to do that every single time you needed to load or transform data in your lake.
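That manual “convert to Delta” step typically looked like the sketch below in a Synapse Spark notebook. The lake paths and naming convention are hypothetical, and since running it needs a Spark pool (or delta-spark locally), the Spark part is wrapped in a function rather than executed at import time:

```python
# Sketch of the manual CSV-to-Delta conversion a Synapse Spark notebook
# required before Fabric automated it. Paths and table names are made up.

def delta_path(lake_root: str, table: str) -> str:
    """Target folder for the Delta table (simple hypothetical convention)."""
    return f"{lake_root.rstrip('/')}/delta/{table}"

def csv_to_delta(lake_root: str, table: str) -> None:
    # pyspark is available on a Synapse Spark pool; imported lazily here
    from pyspark.sql import SparkSession

    spark = SparkSession.builder.getOrCreate()
    raw = (spark.read
           .option("header", True)
           .csv(f"{lake_root.rstrip('/')}/raw/{table}"))  # source CSV folder
    (raw.write
        .format("delta")              # the manual conversion step itself
        .mode("overwrite")
        .save(delta_path(lake_root, table)))

# Typical call on a Spark pool (hypothetical account name):
# csv_to_delta("abfss://data@mylake.dfs.core.windows.net", "sales")
```

In Fabric, loading the data into the Lakehouse produces Delta directly, so this boilerplate disappears.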
Here, except for the AI assistant that generates pipelines from natural language, I did not see anything new. And so far I’m still not convinced that natural language will help you develop your pipelines faster, as you still need to understand what kind of pipeline you need and what it has to do. If you can explain the problem well enough, you are most likely able to solve it yourself.
It is not new, but they have added some great enhancements, such as the Delphix service. The “generate by example” feature is also great, even though it was already present in Power BI Desktop; I’m not aware of any improvements having been made there.
As opposed to generating a pipeline with natural language, I think that being able to generate a pipeline by example is an incredible feature that can definitely be used in self-service, and as far as I could test it in Power BI Desktop, it worked quite well!
To be fair, I haven’t had time to explore this piece yet. To me, it is a combination of Event Hub and Stream Analytics, which were both individual products in Azure but are now integrated into Fabric as part of the Real-Time Analytics experience, and I think Microsoft has brought some amazing enhancements there as well.
Well, everyone knows Power BI, and everyone knows it is not new at all. The main new part is the use of Copilot to generate DAX code or reports, which I will discuss in the Copilot section.
And of course, Direct Lake is a brand-new feature, and from what I have read so far, the performance is really good!
Copilot itself is new, but what Copilot does is not really new, at least on the Power BI side.
We could already use Q&A in PBI, as long as all the synonyms were defined correctly; I have to admit that I never used Q&A in my life except for demos…
As for generating reports, we could already leverage the “Get Insights” feature on a dataset, or some of the existing AI visuals, and it was also possible to generate DAX code using quick measures.
Now Copilot will also generate Spark code, write ML models, write SQL code, and so on, so outside of PBI it is completely new.
However, I tend to be a bit skeptical about using AI to write code, especially if you don’t understand the generated code. For technical people it will surely enhance productivity, but for non-technical people there is a risk of misusing Copilot and producing wrong content.
Before Data Activator, we could create an alert via Flow or a PBI alert and trigger actions using Power Automate. However, I really like the way they have combined everything together, and they have also greatly improved the monitoring side: being able to see all the triggered alerts at a glance is a great enhancement.
Even though I’m not sure how much it will cost, as far as I understand we will pay a single price for the whole platform, and this is going to make things so much easier. If you want to figure out how much using all the tools included in Fabric separately would cost you, I wish you good luck!
So is Fabric new? The short answer is “Yes”, and the long answer is of course also “Yes”. Even if most of the individual tools in Fabric already existed, still exist today, and will certainly exist for a while, Microsoft has made them work together quite seamlessly, and this is just the beginning…
I will of course write more content on Fabric from now on as there are still a ton of new things to try out.