Title: ETL Analyst / Developer
Sector: Financial Services
Our client is a fintech-driven global provider of consumer credit services: a relatively young business experiencing considerable growth across the countries in which it currently operates. This is a truly exciting company.
The ETL Analyst creates ETL code/packages for loading data into a Reporting Database, Data Warehouse, or Data Lake (Big Data), and is responsible for building ETL that is efficient, reusable, secure, and performant. The ETL Analyst will play a complementary role in business intelligence, advanced analytics, data integration, and master data management efforts, and will work with Data Analysts and DWH Architects.
- Developing ETL code/packages based on in-house standards, currently Pentaho (Kettle)
- Building reports in SAS, R, Pentaho, Jasper, GoodData, or other open-source tools to report on data quality
- Documenting in a CASE tool, such as Enterprise Architect
- Supporting the loading of data into the Data Warehouse/Data Lake (sometimes outside normal working hours)
- Ensuring that the ETL is robust and easy to support operationally
- Ensuring that data security and privacy requirements are fully understood for the data and that a secure environment is maintained
- Critically evaluating information gathered from multiple sources, reconciling conflicts, and classifying the information into logical categories
- Identifying and documenting sources of existing and new data; understanding the use of master and reference data, including sources and contributors
- Documenting source-to-target mappings, with data-quality and transformation rules, for both data integration and web services (consumer/provider mappings), in a form easily understood by project team members
- Collaborating with data scientists and business partners, and conducting data profiling and predictive analysis using a variety of standard tools
- Sharing joint accountability with data stewards and data analysts on projects to ensure conformance to enterprise data-governance policies on information risk and data protection; ETL analysts should be familiar with common data-masking and protection schemes
- Performing data source identification, data profiling, interpretation of patterns and trends, and assessment and improvement of data quality, with versatility in visualization tools and techniques to share analysis findings. It is beneficial to have worked as a data analyst on any of the following types of projects:
- Analytically Intensive – intended to draw/develop insights from collected data
- Operationally Data Intensive – intended to create authoritative data sources that are essential to core transactional processing
- Data Integration Intensive – requiring the merging of many data types to satisfy business requirements