Programmer Analyst III – Talend

Department: Data Engineering
Job Type: Full-Time
Location: Nationwide

Job Overview:
We are looking for an experienced Data Engineer to create and manage ETL processes that load data from various sources into the Enterprise Data Warehouse. You will participate in the design, development, testing, execution, and quality assurance of both automated and manual data integration and exchange processes, and you will develop and deliver proofs of concept (POCs) and implement projects hands-on.
You should be able to solve the challenges that come with large volumes of data and complex business rules. You will research, evaluate, architect, and deploy new tools, frameworks, and patterns to build sustainable Data Engineering solutions.
Responsibilities:
• Closely follow the direction set by delivery management and senior team members to achieve common goals; assist senior team members in collecting and analyzing client requirements and in configuring, testing, and validating the software application.
• Understand and translate client needs into business and technology solutions through documentation, analysis, design, and development of best-practice business changes.
• Act as a liaison between senior consultants and client users, providing support, communication on project updates, preparation, data analysis, documentation, and training.
• Develop requirements; perform data collection, cleansing, transformation, and loading to populate fact and dimension tables in the data warehouse (see the cleansing-routine sketch after this list).
• Tune SQL performance; understand the goals and risks associated with the business and technical requirements, and offer counsel on risk mitigation and on aligning the data solution with objectives.
• Develop, validate, and implement efficient ETL code for data integration layers such as data warehouses or data marts to deliver multiple projects.
• Coordinate activities with data source application owners to ensure integration, data integrity and data quality.
• Provide production support for all ETL Processes.
• Maintain and manage technical metadata and documentation for data warehousing and ETL processes.
• Be a team player able to work with minimal supervision while collaborating with local and remote teams across multiple departments.
• Interface with the client in a professional and consultative manner.
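To illustrate the kind of cleansing logic this work involves, here is a minimal sketch of a Talend custom routine: a plain Java class in the routines package whose static methods can be called from tMap expressions. The class name, method names, and rules are hypothetical examples, not part of any particular project.

    package routines;

    /*
     * Hypothetical Talend custom routine: static helpers callable from
     * tMap expressions to standardize source data before it is loaded
     * into dimension tables.
     */
    public class CleansingRoutines {

        // Trim and upper-case a code field; map null/blank values to a default.
        public static String standardizeCode(String raw, String defaultValue) {
            if (raw == null || raw.trim().isEmpty()) {
                return defaultValue;
            }
            return raw.trim().toUpperCase();
        }

        // Strip non-digit characters from a phone number; null if nothing remains.
        public static String digitsOnly(String phone) {
            if (phone == null) {
                return null;
            }
            String digits = phone.replaceAll("\\D", "");
            return digits.isEmpty() ? null : digits;
        }
    }

In a tMap expression this might be invoked as CleansingRoutines.standardizeCode(row1.country_code, "UNK").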
Skills and Experience Required:

• Analyze, design, develop, and implement new data engineering requirements, and provide estimates to support timelines and deliverables.
• Experience working in Agile, Waterfall, and Scrum environments.
• Install and configure Talend Enterprise/Open Studio 5.x and 6.x product suites, including TAC (Talend Administration Center), MDM, version control, and the scheduler.
• Administer the Talend server, databases, domain configuration, security, and authentication.
• Design data marts and data warehouses using star and snowflake schemas when implementing decision support systems.
• Design and build data integration jobs in Talend, both standard jobs and Big Data/Hadoop jobs.
• Develop ETL using native database utilities and/or third-party tools in an enterprise environment with a relational database back end.
• Work with the Hadoop ecosystem (HDFS, Hive, MapReduce, Spark, etc.). Practical experience in Spark (Scala or Java) is a plus.
• Exposure to Talend Data Quality, Master Data Management or Metadata Management solutions.
• Develop using AWS services (EC2, S3, and EMR) and cloud application concepts (multi-tenancy, elasticity, and scalability).
• Write UDFs in Java/Spark within Talend (see the Spark UDF sketch after this list).
• Work with RDBMS OLTP or OLAP data stores such as Oracle, SQL Server, Sybase, or Teradata.
• Extract data from a variety of data stores, including relational databases, RESTful web services, LDAP queries, and a variety of flat-file structures (e.g., EDI, CCD, HL7).
• Work with the processes and technologies used to model data in motion as well as data at rest, across formats and platforms such as XML, JSON, traditional relational databases, and schema-on-read platforms like Hadoop.
• Experience with one or more source control tools (SVN, TFS, Rational ClearCase, etc.).
• Solid experience in new development, maintenance and infrastructure support roles.
• Must have a flexible attitude and be able to comfortably accommodate change.
• Excellent problem solver, able to assimilate information quickly.
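For reference, here is a minimal sketch of the kind of Spark UDF, written in Java, that a Talend Big Data job can hand off to a Spark cluster. The UDF name, the sample column, and the input path are hypothetical; a real Talend job would wire the Dataset through its component graph rather than reading it directly.

    import org.apache.spark.sql.Dataset;
    import org.apache.spark.sql.Row;
    import org.apache.spark.sql.SparkSession;
    import org.apache.spark.sql.api.java.UDF1;
    import org.apache.spark.sql.types.DataTypes;

    import static org.apache.spark.sql.functions.callUDF;
    import static org.apache.spark.sql.functions.col;

    public class NormalizeEmailUdf {
        public static void main(String[] args) {
            SparkSession spark = SparkSession.builder()
                    .appName("udf-sketch")
                    .getOrCreate();

            // Register a UDF that trims and lower-cases an email address.
            spark.udf().register("normalize_email",
                    (UDF1<String, String>) email ->
                            email == null ? null : email.trim().toLowerCase(),
                    DataTypes.StringType);

            // Hypothetical input; apply the UDF to add a cleaned column.
            Dataset<Row> customers = spark.read().parquet("hdfs:///data/customers");
            customers.withColumn("email_clean",
                             callUDF("normalize_email", col("email")))
                     .show();
        }
    }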
Educational Qualifications:
• Required – Bachelor’s degree in Computer Science, Information Technology, Computer Engineering, or a closely related field, or equivalent
• Preferred – Master’s degree in Management Information Systems (MIS), Computer Science, Big Data, or Analytics, or equivalent
Travel:
• Open to travel based on the nature of the engagement
Equal Employment Opportunity:
DataFactZ does not discriminate in employment on the basis of race, religion, gender, sexual orientation, age, or any other basis covered by federal, state, or local law.
Employment decisions are based solely on qualifications, merit, and business needs.