I’ve been in the data warehousing business since 2010. I started as a consultant for SQL Server and experienced it from nearly every possible angle. I was assigned to the local KBC branch as a database administrator, but also (although unofficially) worked as an SQL and ETL developer. By the time I left KBC, I was developing ETLs, cubes, and reports on the SQL Server platform. By the end of 2013, I wanted to move on, so I started looking for a new job. As a farewell to the local SQL Server community, I gave a presentation about SQL Server covering scaling, server configuration, going to production, backups, and disaster recovery planning. The presentation was given in Czech and is available on YouTube.
In February 2014, I joined Teradata and started learning another database engine from scratch. In April, I was assigned to a Lufthansa project and got my first hands-on experience with the Teradata database and BTEQ loading scripts.
I realized I wanted to move from the purely relational world into the world of Big Data. I did some research on the internet and read Taming The Big Data Tidal Wave by Bill Franks to get a general overview. Based on that research, I made an action plan for the upcoming 12 months:
- Take Java classes at my university
- Write a bachelor thesis on Big Data
- Take a data scientist course
- Learn Hadoop: Hive, Pig, MapReduce
- Learn Aster for data visualization
So far, every point of the plan is on track. Additionally, I attended two training courses. The first, Apache Hadoop 2.0 Data Analysis with the Hortonworks Data Platform using Pig and Hive, was presented by a great instructor and now a good friend, Jessica Marceau. The second, the Aster Implementation and Management Workshop, was presented by Mark Ott.
Both courses were enormously exciting and gave me a lot of inspiration for my future showcases, so stay tuned.