How to design and set up a Lakehouse architecture using Azure Synapse/Databricks

The lakehouse architecture is quickly becoming the new industry standard for data, analytics, and AI. It proposes a solution to the most important challenges facing the established data architectures: the data warehouse and the data lake.
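To make that claim concrete, here is a minimal, illustrative sketch (not taken from the training material) of the core lakehouse idea: Delta Lake, running on Apache Spark, adds ACID tables and SQL querying directly on top of plain data lake files. It assumes a local PySpark setup with the delta-spark package installed; the table contents and storage path are made up.

```python
# Minimal lakehouse sketch: a Delta table on plain data lake storage, queried with SQL.
# Assumes PySpark plus the delta-spark package; the path and data below are illustrative.
from delta import configure_spark_with_delta_pip
from pyspark.sql import SparkSession

builder = (
    SparkSession.builder.appName("lakehouse-demo")
    .config("spark.sql.extensions", "io.delta.sql.DeltaSparkSessionExtension")
    .config("spark.sql.catalog.spark_catalog",
            "org.apache.spark.sql.delta.catalog.DeltaCatalog")
)
spark = configure_spark_with_delta_pip(builder).getOrCreate()

# Write a small DataFrame as a Delta table: ordinary files on the lake, but with ACID guarantees.
df = spark.createDataFrame(
    [(1, "sensor-a", 21.5), (2, "sensor-b", 19.8)],
    ["id", "device", "temperature"],
)
df.write.format("delta").mode("overwrite").save("/tmp/lakehouse/measurements")

# Query it back with plain SQL, warehouse-style, straight from the lake files.
spark.read.format("delta").load("/tmp/lakehouse/measurements") \
    .createOrReplaceTempView("measurements")
spark.sql(
    "SELECT device, avg(temperature) AS avg_temp FROM measurements GROUP BY device"
).show()
```

The same pattern carries over to Azure Synapse or Databricks, where the storage path would point at a cloud data lake (for example ADLS) instead of a local directory.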

In this training, we'll guide you through the most important concepts within the lakehouse architecture and explore Delta Lake and Apache Spark.

After this training, you will have the necessary insights to design and set up a Lakehouse architecture using Azure Synapse or Databricks.

This training is for data architects, engineers, and developers.

Related blogs

Why lift-and-shift isn’t copy-and-paste

Lift-and-shift is potentially a very efficient method to move your applications to the cloud. You need to be aware, though, of the implications of the pay-as-you-go pricing model that comes with a cloud deployment. Check out our 3 tips to ensure lift-and-shift delivers the most cost-effective solution.

AI: The Future is Now - Are You Ready to Embrace it?

This blog post will look into the evolution AI has gone through to become what it is today. We will then look ahead at how AI is likely to impact our future. Lastly, we will outline a plan of attack that companies should consider to be best prepared for the drastic changes AI will bring to our way of working.

How to query your S3 Data Lake using Athena within an AWS Glue Python shell job

AWS Glue, the serverless ETL service of AWS, supports two types of jobs: Spark and Python shell. In this article, we'll focus on Python shell jobs and explain how you can query your S3 Data Lake with Athena from within them.

What an event-driven architecture brings to the table to solve your data ingestion challenges

Before you can generate insights from your data, you need to move that data from an operational to an analytical environment, a process commonly referred to as data ingestion. An event-driven architecture provides an elegant way to make that process timely, performant, and cost-effective.
