Description:
Local, fully vaccinated candidates only.
Job Description
We are seeking skilled, enthusiastic, and experienced technologists to join our data integration engineering team, located in New York, Montreal, and Georgia.
The candidate will work with the existing ETL engineering team to complete the Informatica upgrade from version 10.2 to version 10.5 across the enterprise. This role requires extensive experience deploying and supporting large-scale Informatica solutions.
Primary responsibilities include troubleshooting and helping development teams with best practices and on-boarding; monitoring, optimizing, and tuning implementations; and developing self-service tooling. The ideal candidate has experience building ETL infrastructure solutions and is proficient with multiple database platforms, such as Sybase, DB2, and Hadoop, and with programming/scripting languages such as Java and Perl on Linux.
Qualifications
Experience: 8+ years
Required Skills:
- 5+ years of experience in design, development, implementation, unit testing, troubleshooting, and support of ETL processes with Informatica
- 5+ years of working experience with at least three major DBMS products (e.g. Sybase ASE, DB2 UDB, MSSQL, Oracle, Hadoop, Greenplum, MongoDB, and/or Postgres), with knowledge of database internals and implementation details
- 2+ years of hands-on software development experience using either Java or Python on Linux
- Strong experience with system performance tuning on Linux
- Strong fundamentals in distributed system design, development, and deployment using agile/DevOps practices
- Excellent verbal and written communication skills
- Experience building RESTful APIs
- Experience in shell scripting
- Experience with tools such as Git, Jira, and Bitbucket
- A self-starter with the ability to work effectively in teams
Desired Skills:
- Experience with other ETL tooling such as Talend or SSIS
- Experience with other data pipeline tooling such as Luigi, NiFi, or Airflow
- Experience with data discovery and visual analytics tools such as Power BI or Tableau
- Experience with public cloud services, terminology, and security/control requirements
- Deep knowledge of JVM internals
- Python knowledge
- Contributor/committer to open source projects
- Experience working with and managing software in a large distributed environment
- Working knowledge of data virtualization platforms (such as Tibco Data Virtualization, Denodo, IBM InfoSphere Information Services, or Presto)
- Experience working with Docker and Kubernetes