- Work in interesting and challenging projects.
- Flexible working hours.
- A great, non-corporate atmosphere.
- Stable employment (contract of employment or B2B contract).
- Opportunity to develop your career and work with new technologies.
- Attractive benefits package.
- Remote work.
- Maintain and develop new features for the LDA Hadoop platform.
- Apply best practices to optimize the hardware setup and improve data structures, achieving the best possible performance for analytics and for serving data to downstream systems.
- Develop complex multi-source ETL pipelines and Data Quality Assurance modules.
- Monitor performance and suggest enhancements to the production cluster.
- Develop integration modules for the Data Catalog.
- Automate maintenance tasks in the Hadoop cluster.
- Act as 2nd/3rd line support for production applications on the Hadoop cluster.
- Develop and refactor SQL/PSQL-based applications in technologies such as BigSQL, Hive, and potentially Impala.
- Develop REST API applications serving Hadoop data.
- Develop monitoring routines and simple dashboards.
- Experience in SQL (MSSQL and/or DB2), with a minimum of 3 years of experience in query optimization.
- Knowledge of and experience in Data Quality and Data Catalog.
- Experience with Apache Hadoop would be a plus.
- Knowledge of Sqoop, NiFi, Linux, Hive/BigSQL, Python, and Spark/PySpark.
- Strong knowledge of Data Warehousing design principles, sufficient to demonstrate sound ETL implementation logic.
- Analytical mindset and good problem-solving skills.
- Good communication skills.
- English language proficiency.