Infosys is seeking a Data Platform Engineer (Big Data/Python); experience in platform engineering and administration is preferred. In this position, you will primarily be responsible for interfacing with key stakeholders and applying your technical proficiency across different stages of the Software Development Life Cycle, including requirements elicitation, application architecture definition, and design. You will play an important role in creating high-level design artifacts. You will also deliver high-quality code deliverables for a module, lead validation for all types of testing, and support activities related to implementation, transition, and warranty. You will be part of a learning culture, where teamwork and collaboration are encouraged, excellence is rewarded, and diversity is respected and valued.
Required Qualifications:
- Candidate must be located within commuting distance of Charlotte, NC, or be willing to relocate to the area. This position may require travel in the US and Canada.
- Bachelor's degree or foreign equivalent; work experience will be considered in lieu of a degree
- At least 4 years of experience with Information Technology
- 2+ years of experience in the Hadoop ecosystem, e.g., Hadoop, HBase, Hive, Scala, Spark, Sqoop, Flume, Kafka, Python
- 2+ years of experience in Python programming, with a strong understanding of and hands-on programming/scripting skills in Python, UNIX shell, Perl, and JavaScript
- Knowledge of object-oriented concepts, data structures, and algorithms
- Experience in end-to-end implementation of DW/BI projects, especially data warehouse and data mart development
- Experience in installing, configuring, supporting, maintaining, and tuning IDE tools such as JupyterHub, Jupyter Enterprise Gateway, and Zeppelin for data scientists
- Experience in Scala, Python, PySpark, SparkR, and big data ML toolkits such as TensorFlow, Spark ML, or H2O
- Experience in integrating big data tools and frameworks to support business use cases
- Experience in designing and administering graph database platforms such as Neo4j and TigerGraph
- Execute PoCs and establish connectivity methodologies with Hadoop, Hive, and Teradata
- Interact with business teams to gather requirements
- Assist the infrastructure team with sizing
- Interact with H2O support on functional issues and solutions
- Lead team activities from a functional and technical point of view
- Guide team members with implementation
- Knowledge of and experience with the full software development life cycle (SDLC)
- Experience with Lean/Agile development methodologies
- U.S. citizenship or permanent residency required; we are not able to sponsor at this time
Preferred Qualifications:
- At least 3 years of experience in the software development life cycle
- At least 3 years of experience in Project life cycle activities on development and maintenance projects
- 3+ years of experience in the Hadoop ecosystem, e.g., Hadoop, HBase, Hive, Scala, Spark, Sqoop, Flume, Kafka, Python
- Experience in developing data science/analytics pipelines, from installation of packages through data engineering and data analytics, is preferred
- Working experience with machine learning libraries, H2O, and NLP is preferred
- At least 1 year of experience in Relational Modeling, Dimensional Modeling and Modeling of Unstructured Data
- Good experience in end-to-end implementation of DW/BI projects, especially data warehouse and data mart development
- Good understanding of data integration, data quality, and data architecture
- Experience with big data technologies is preferred
- Good expertise in analyzing the impact of changes or issues
- Experience in preparing test scripts and test cases to validate data and maintain data quality
- Strong understanding of and hands-on programming/scripting skills in UNIX shell, Perl, and JavaScript
- Experience with design and implementation of ETL/ELT frameworks for complex warehouses/marts; knowledge of large data sets and experience with performance tuning and troubleshooting
- Hands-on development, with a willingness to troubleshoot and solve complex problems
- CI/CD exposure
- Ability to work in a team in a diverse, multi-stakeholder environment
- Ability to communicate complex technology solutions to diverse teams, namely technical, business, and management teams
- Experience managing a team of 2-3 would be a plus
- Excellent verbal and written communication skills
- Experience with and desire to work in a global delivery environment
The job may entail extensive travel. The job may also entail sitting as well as working at a computer for extended periods of time. Candidates should be able to effectively communicate by telephone, email, and face to face.
About Us
Infosys is a global leader in next-generation digital services and consulting. We enable clients in 46 countries to navigate their digital transformation.
With over three decades of experience in managing the systems and workings of global enterprises, we expertly steer our clients through their digital journey. We do it by enabling the enterprise with an AI-powered core that helps prioritize the execution of change. We also empower the business with agile digital at scale to deliver unprecedented levels of performance and customer delight. Our always-on learning agenda drives their continuous improvement through building and transferring digital skills, expertise, and ideas from our innovation ecosystem.
To learn more about Infosys and see our ideas in action please visit us at www.Infosys.com
EOE/Minority/Female/Veteran/Disabled/Sexual Orientation/Gender Identity/National Origin