Data Engineer

 

Description:

  • Responsible for end-to-end implementation of Pentaho KTR (Kettle transformation) scripts.
  • Should be able to perform data extraction and data loading, resolve errors, and verify filter criteria.
  • Understanding of the core steps: input, output, and database connections.
  • Understanding of jobs/transformations and how to inspect the data.
  • Understanding of Salesforce basics: fields, objects, and lookups.
  • Knowledge of schemas, databases, and views.
  • Provide technical expertise to support the requirements definition, design, and development of data warehouse components/subsystems, specifically Extract-Transform-Load (ETL), Extract-Load-Transform (ELT), and/or Change Data Capture (CDC) technologies and implementations.
  • Generate fallout reports, analyze bugs and errors, and understand how to obtain the output file.
  • Research, design, develop, and modify ETL and related database functions, implementing changes/enhancements to the system.
  • Design ETL processes and develop source-to-target transformations and load processes.
  • Develop, test, integrate, and deploy Kettle transformations, including configuration management activities related to the ETL/CDC environment.
  • Resolve requests for assistance in troubleshooting issues assigned to the team.
  • Support functional/regression testing for system functionality releases, patches, and upgrades.
  • Support production jobs, debug issues during failures, perform root cause analysis (RCA), and fix issues to resume operations.
  • Knowledge of scheduling tools such as TWS and Informatica schedulers.
  • Analyze process improvement areas and recommend changes to processes/procedures for efficiency gains, cost savings, etc.
  • Work in partnership with key business users to identify potential ways of improving the efficiency and/or effectiveness of current business operations.
  • Build a deep technical understanding of how the business operates: departmental/divisional structure, functions, processes, procedures, and current application functionality.
  • Assist with the design of Pentaho solutions and project planning; add value in all stages of project work (definition, development, deployment).
  • Strong background in Pentaho.
  • Must know how to migrate data using tools such as Salesforce Data Loader, Workbench, and dataloader.io.
  • Lead & Coordinate with QA, UAT and Go-Live Activities.

Organization S&P Global
Industry IT / Telecom / Software Jobs
Occupational Category Data Engineer
Job Location Islamabad, Pakistan
Shift Type Morning
Job Type Full Time
Gender No Preference
Career Level Intermediate
Experience 2 Years
Posted at 2023-03-27 2:44 pm
Expires on 2024-12-27