By popular demand: six years of Datashift, three more lessons learned.

Two weeks ago, we posted a blog article about the three lessons we’ve learned in the past six years of Datashift. We received such great responses to our list of “don’ts” that I decided to share a list of three “do’s” as well. So, back by popular demand, here are three more important lessons that six years of Datashift have taught us.


Read More

Six years of Datashift, three lessons learned

Datashift’s story started on January 15th, 2015. From the get-go, it was clear we wanted to do great things with Datashift. But building a company from the ground up is never a straight line. There are bumps in the road, highs and lows, but most importantly: lessons learned. As we celebrate six years of Datashift, I'd like to look back and share some of those lessons.

Read More

How to build a churn prediction model that actually works

The truth is, predicting churn is easy. The hard part is making it actionable. With the approach described here, you target only your valuable customers who are about to churn, with a personalized retention action at the right time.
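
To give a flavour of that idea, here is a minimal sketch in Python with pandas: combine the churn model's probability with a customer-value estimate and keep only the customers worth a retention action. The column names and thresholds are hypothetical placeholders, not the ones used in the article.

```python
# A minimal sketch of "actionable churn": combine a churn probability with a
# customer-value estimate and only target the customers where a retention
# action is worth the cost. Columns and thresholds are hypothetical.
import pandas as pd

customers = pd.DataFrame({
    "customer_id": [1, 2, 3, 4],
    "churn_probability": [0.85, 0.10, 0.70, 0.92],    # output of your churn model
    "customer_value": [1200.0, 900.0, 150.0, 2000.0], # e.g. expected yearly revenue
})

CHURN_THRESHOLD = 0.6    # "about to churn"
VALUE_THRESHOLD = 500.0  # "valuable customer"

# Keep only valuable customers that are likely to churn, most valuable first,
# so the retention budget is spent where it matters.
target_list = (
    customers[(customers["churn_probability"] >= CHURN_THRESHOLD)
              & (customers["customer_value"] >= VALUE_THRESHOLD)]
    .sort_values("customer_value", ascending=False)
)
print(target_list)
```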

Read More

5 ways to leverage your existing data with minimal effort

Do most data blogs make you feel like everything you should do is expensive, time-consuming, and requires complex skills? Well, this blog is different. Here are 5 practical recommendations (in no particular order) to leverage your existing business data with less effort than you think.

Read More

How to auto-scale your Azure SQL database in Azure Data Factory

One of the advantages of running databases on Azure SQL Database is the ability to manage them dynamically to adapt to changing workload demands. Auto-scaling databases is one of the most cost-effective ways to increase the performance of data operations. In this guide, we'll show you how to auto-scale your database using Azure Data Factory.
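
As a taste of the mechanism behind it: scaling an Azure SQL database ultimately comes down to a single T-SQL statement, which an Azure Data Factory pipeline activity can issue on a schedule or trigger. The sketch below runs that same statement from Python with pyodbc; the server, database, credentials, and target service tier are placeholders, and the full Data Factory setup is what the article walks through.

```python
# A minimal sketch of the scaling statement itself, outside Azure Data Factory.
# Server, database, credentials and target tier below are placeholders.
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 18 for SQL Server};"
    "SERVER=myserver.database.windows.net;"  # placeholder server
    "DATABASE=master;"                        # scaling is issued against master
    "UID=myadmin;PWD=mypassword",             # use a proper secret store in practice
    autocommit=True,
)

# Scale the database up to the S3 service objective; the change is applied
# asynchronously, so the database may take a few minutes to switch tiers.
conn.execute("ALTER DATABASE [mydatabase] MODIFY (SERVICE_OBJECTIVE = 'S3');")
conn.close()
```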

Read More

Kick-start your data governance program with a business glossary

No one likes misunderstandings. We've all been in a "the Italian man who went to Malta" situation before. Annoyingly enough, this situation happens in a lot of businesses too. Many discussions between departments have the same root challenge: different people speaking different 'languages'. In this article, we’ll set the stage to take on this challenge.

Read More

How to import an Excel or CSV through the Collibra API

A thriving Collibra instance is populated with a wide spectrum of data, such as business terms, policies, code values, and metadata for schemas, tables, and columns. To keep Collibra relevant within an organization, this data needs to be accurate and up to date. Collibra provides tools such as Collibra Catalog and the Collibra API to import data automatically at a set frequency.
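
As an illustration of what such an automated import can look like, here is a minimal Python sketch that reads a CSV with pandas and pushes each row to a Collibra REST endpoint. The base URL, credentials, domain and type IDs are placeholders, and the exact endpoint paths and payloads depend on your Collibra version, so check your instance's API documentation; the article covers the full approach.

```python
# A minimal sketch of pushing rows from a CSV (or Excel) file into Collibra
# over its REST API. URL, credentials, domain/type IDs and the endpoint path
# are assumptions; consult the API docs of your Collibra version.
import pandas as pd
import requests

BASE_URL = "https://your-instance.collibra.com/rest/2.0"  # placeholder
session = requests.Session()
session.auth = ("api_user", "api_password")                # placeholder credentials

# Read the source file; pd.read_excel works the same way for .xlsx files.
terms = pd.read_csv("business_terms.csv")  # expects columns: name, definition

for _, row in terms.iterrows():
    payload = {
        "name": row["name"],
        "displayName": row["name"],
        "domainId": "00000000-0000-0000-0000-000000000000",  # placeholder domain
        "typeId": "00000000-0000-0000-0000-000000000000",    # placeholder asset type
    }
    response = session.post(f"{BASE_URL}/assets", json=payload)
    response.raise_for_status()
    print(f"Created asset {row['name']}: {response.json().get('id')}")
```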

Read More

Azure Data Factory vs on-premise ETL tools

Choosing the right ETL tool for your company is a complex task. Both Azure Data Factory and on-premise tools have their strengths and weaknesses, and it's important to understand the parameters and nuances involved to pick the right tool for the job. In this article, we'll clarify the key differences and help you make the right decision for your business.

Read More

Microsoft Azure Data Factory as a must-consider alternative to on-premise ETL tools

When you use cloud-based technology, your data is processed, stored, and maintained in the cloud rather than on a physical server at your organization.

This means that no infrastructure is necessary for the set-up and you don’t need to worry about system maintenance. That saves resources (time and money) at the start of a project, which can instead be spent on understanding the requirements of the business. Your cloud solution will also be more adaptable to changing situations: new features can be added easily and scaling up is only one click away. And thanks to the various data security protocols and features on offer, you can sleep soundly knowing that your data is secure in the cloud.

Read More