Developer - Big Data and Analytic Services

Economical Insurance Group

Kitchener, ON

Industry: Accounting, Finance & Insurance

5–7 years' experience

Posted 88 days ago


WHY ECONOMICAL?

Because a changing industry calls for a new way of doing things. An ambitious, innovative and fast-paced company, we offer exceptional training and development programs, competitive pay, great benefits, company-matched RRSPs, and paid volunteer days — all within an engaging, fun, and collaborative work environment.

Founded more than 145 years ago, Economical is one of Canada’s leading property and casualty insurance companies. We rely on our national network of more than 800 independent brokers to sell a range of car, home, business, and farm insurance solutions. With Sonnet and Petline, we’re extending our reach through the direct-to-customer channel. With more than 2,400 active employees across the country and a commitment to rethinking the insurance experience, we’re poised for great things.

We’re now adding to our high-performance team to take us into the future. Let’s rethink insurance, together.

Who We’re Looking For

o You’re highly flexible and able to quickly adapt to changing priorities

o You’re excellent at identifying and articulating problems and influencing decision-makers

o You’re highly organized, self-motivated, and customer-oriented, able to work independently, within a collaborative team environment, and with internal business and technology partners at all levels of the organization

o You recognize when customers might not be satisfied with a solution even though it meets requirements, and you’re comfortable recommending alternatives

o You act as an expert resource providing insight and recommendations based on industry and technology trends, system strategy and design


What will your responsibilities be?

Solutions Design & Development

o Develops high-quality solutions and enhances the Big Data pipeline to extract data from various source systems using Big Data technology and load it into the Enterprise Data Lake

o Collaborates with the Technical Lead to automate unit tests, applying development practices such as TDD and Pair Programming

o Responsible for unit testing their own code and supporting story testing and system testing of it

o Proactively identifies technical debt and seeks ways to improve the codebase through refactoring

Support and Maintenance of Big Data Platforms

o Provides assistance to the application support team in troubleshooting and resolving production issues

o Verifies that solution documentation is complete, accurate, auditable, and is traceable to business and/or systems requirements

Expertise with SDLC (Lean/Agile Expertise)

o Collaborates with the Scrum Master and the team to develop and maintain the project Kanban system, and to develop the Sprint Plan and Release Plan

o Assists in the translation of requirements into a story map in collaboration with the Discovery Team

o Designs solutions to meet only the requirements specified for the current sprint

Relationship Management

o Supports the team to obtain stakeholder buy-in and acceptance for application and technical designs

o Works collaboratively with System Integration partners, Designers, Architects, the Technical Lead, Business Analysts, Technical Testers, and other Developers on detailed designs

o Communicates project status and provides timely escalation of issues to ensure project objectives are met

IT Operational Expertise

o Participates in knowledge transfer within the team and business units

o Ensures that design and development knowledge is codified, monitored, tracked and managed

o Coordinates / facilitates training and communication of key knowledge assets with all required SMEs

Risk Management

o Assesses the likelihood of something going wrong based on the complexity of the solution and other influencing factors such as the experience level of the individual or the team, the newness of the technique, application or language, the condition of the source data, etc.

o Creates a risk response (mitigate, ignore, transfer, accept) that eliminates the risk or minimizes the impact of risks that become an issue

o Adheres to existing processes / standards, business technology architecture, risk and production capacity guidelines

Your Skills and Experience:

Must-have

  • 5+ years’ experience in building Java/Scala projects

  • 2+ years of experience in developing ETL pipelines and analytics applications in Java/Scala for Big Data Hadoop platforms such as Cloudera or Hortonworks

  • Solid experience with Apache Hadoop, HDFS, Spark, Hive, Impala and other big-data technologies

  • Experience with Kafka, Spark Streaming, Flume, messaging services (MQ), and batch and real-time streaming/ingestion techniques

  • Experience with NoSQL schema design

  • Experience developing applications using different file formats such as XML, JSON, Avro, and Parquet

  • Strong experience with SQL and knowledge of data warehouse design and data modelling concepts

  • Solid experience with Linux shell/Bash scripts and overall Linux OS concepts

  • 3+ years’ experience in Agile/Scrum development practices using JIRA, Confluence

  • Experience with continuous integration/delivery best-practices, technologies and tools such as Bitbucket/GitHub, Jenkins, and Artifactory

  • Experience developing cloud-based applications with platforms such as AWS and Azure

Nice-to-have

  • Experience with Docker, MicroServices, Kubernetes

  • Experience with RESTful web services development

  • Experience with Python and machine learning algorithms/libraries

  • Experience with ETL tools such as Pentaho, Talend

  • Experience with Job scheduling tools such as Control-M, Oozie
