Data Engineer

 

Description:

We are the premier provider of data feed products to the global investment community. As we continue to expand our coverage - not only in the US, but also in Europe and Asia - and improve our products and solutions, it is increasingly important to complement our best-in-class services with an exceptional organization of people.

We pursue excellence in everything we do. We value results, encourage teamwork, and embrace change. Our team is responsible for the design, architecture, development, and implementation of Salesforce CPQ & Billing and Oracle Financial Cloud applications for organization-wide needs.
 



Responsibilities

  • Responsible for end-to-end implementation of MS SQL Server scripts, including SQL views, stored procedures, triggers, and advanced joins (see the sketch after this list).
  • Able to perform data extraction and data loading, resolve load errors, and check filter criteria.
  • Understanding of the steps - input, output, and connection to the database.
  • Understanding of jobs/transformations and of inspecting the data.
  • Understanding of Salesforce basics - fields, objects, and lookups.
  • Knowledge of schemas, databases, and views.
  • Provide technical knowledge/expertise to support the requirements definition, design, and development of data warehouse components/subsystems, specifically Extract-Transform-Load (ETL), Extract-Load-Transform (ELT), and/or Change Data Capture (CDC) technologies and implementations.
  • Retrieve fallout reports and analyze bugs and errors; should understand how to obtain the output file.
  • Research, design, develop, and modify ETL and related database functions, implementing changes/enhancements to the system.
  • Design ETL processes and develop source-to-target transformations and load processes.
  • Develop, test, integrate, and deploy Kettle transformations, including configuration management activities related to the ETL/CDC environment.
  • Resolve requests for assistance in troubleshooting issues assigned to the team.
  • Support functional/regression testing for system functionality releases, patches, and upgrades.
  • Support production jobs, debug any issues during failures, perform root-cause analysis (RCA) of the failure, and fix the issue to resume operations.
  • Knowledge of scheduling tools such as TWS, Informatica schedulers, and SSIS is a plus.
  • Analyze process-improvement areas and recommend changes to processes/procedures for efficiencies, cost savings, etc.
  • Work in partnership with key business users to identify potential ways of improving the efficiency and/or effectiveness of current business operations.
  • Build a deep technical understanding of how the business operates: departmental/divisional structure, functions, processes, procedures, and current application functionality.
  • Assist with the design of MS SQL solutions and project planning. Add value in all stages of project work (definition, development, deployment).
  • Strong background in Pentaho, Informatica, or similar ETL tools.
  • Must know how to migrate data using tools such as Data Loader, Workbench, and dataloader.io.
  • Lead and coordinate QA, UAT, and go-live activities.
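
For illustration only, the sketch below shows the kind of MS SQL Server scripting referenced in the list above - a view, a stored procedure, and a trigger. The object and column names (dbo.OrderStaging, dbo.vw_OrderFallout, dbo.usp_GetFalloutReport, dbo.trg_OrderStaging_Audit) are hypothetical examples, not part of any actual application schema.

  -- Hypothetical sketch: view, stored procedure, and trigger over an assumed staging table.

  -- View exposing only the records that failed to load (the "fallout").
  CREATE VIEW dbo.vw_OrderFallout AS
  SELECT o.OrderId, o.AccountId, o.ErrorMessage, o.LoadedAt
  FROM dbo.OrderStaging AS o
  WHERE o.LoadStatus = 'ERROR';
  GO

  -- Stored procedure returning the fallout report for a given load date.
  CREATE PROCEDURE dbo.usp_GetFalloutReport
      @LoadDate DATE
  AS
  BEGIN
      SET NOCOUNT ON;
      SELECT OrderId, AccountId, ErrorMessage, LoadedAt
      FROM dbo.vw_OrderFallout
      WHERE LoadedAt >= @LoadDate
        AND LoadedAt <  DATEADD(DAY, 1, @LoadDate);
  END;
  GO

  -- Trigger stamping an assumed audit column whenever staging rows are updated.
  CREATE TRIGGER dbo.trg_OrderStaging_Audit
  ON dbo.OrderStaging
  AFTER UPDATE
  AS
  BEGIN
      SET NOCOUNT ON;
      UPDATE s
      SET UpdatedAt = SYSUTCDATETIME()
      FROM dbo.OrderStaging AS s
      JOIN inserted AS i ON i.OrderId = s.OrderId;
  END;
  GO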

What We’re Looking For

  • Thorough knowledge of delivering projects in an Agile Scrum environment.
  • Able to provide leadership, participate, and be a productive member of the team.
  • Must have strong SQL skills.
  • Must be able to review code related to customization as well as integration.
  • Be the lead subject-matter expert in driving industry best practices for the Salesforce ecosystem and associated integrated tools.
  • Experience developing and maintaining code using MS SQL, SSIS, and Pentaho Data Integration.
  • Experience developing Pentaho Data Integration jobs for integrations with Salesforce, flat files, MS SQL Server, Oracle, and other applications.
  • Experience working with Salesforce data objects and their validations, and writing SOQL queries to extract data from Salesforce.
  • Prior experience transforming data from a legacy Salesforce and Zuora implementation to a Salesforce CPQ and Billing implementation using Pentaho Data Integration.
  • Strong analytical and debugging skills.
  • Experience designing large-scale, high-performance data integration solutions.
  • Knowledge of Oracle applications and experience performing integrations with them.
  • Knowledge of the Zuora subscription and billing model is a plus.
  • Experience working in an Agile methodology and the ability to deliver code within the stipulated time.
  • Experience writing performance-efficient complex SQL queries and stored procedures in Microsoft SQL Server and tuning existing queries and procedures (see the tuning sketch after this list).
  • Performing unit testing, code deployments, and maintenance activities on existing code.
  • Performing load runs to higher environments and resolving any bugs identified.
  • Experience with, or knowledge of, other ETL tools such as Informatica is a plus.
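
As one hedged example of the SQL Server query tuning mentioned above, the snippet below rewrites a non-SARGable date filter into a range predicate so that an index on the date column can be used. The table dbo.Invoice, its columns, and its index are assumed purely for illustration.

  -- Hypothetical table dbo.Invoice with an index on InvoiceDate (assumed for illustration).

  -- Wrapping the indexed column in a function forces a scan (non-SARGable predicate):
  SELECT InvoiceId, Amount
  FROM dbo.Invoice
  WHERE YEAR(InvoiceDate) = 2023;

  -- Expressing the same filter as a range lets the optimizer seek the index instead:
  SELECT InvoiceId, Amount
  FROM dbo.Invoice
  WHERE InvoiceDate >= '20230101'
    AND InvoiceDate <  '20240101';

The same range-predicate pattern applies inside stored procedures when filtering on indexed date or key columns.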

Organization S&P Global
Industry Engineering Jobs
Occupational Category Data Engineer
Job Location Islamabad, Pakistan
Shift Type Morning
Job Type Full Time
Gender No Preference
Career Level Experienced Professional
Experience 4 Years
Posted at 2023-09-16 12:53 am
Expires on 2024-12-18