LotLinx, the leading AI-powered digital marketing platform for the automotive industry, is experiencing tremendous growth and has an exciting opportunity for a Data Engineer based in our Winnipeg office. LotLinx offers employees a dynamic work environment that is challenging, team-oriented, and full of passionate people, along with great incentives such as competitive compensation and benefits, flexible time off, and career development opportunities.
Reporting to the Head of the Machine Learning/AI team, the Data Engineer is responsible for developing and deploying ETL processes and for data management and data warehousing in the automotive digital advertising industry. LotLinx is looking for a candidate with a talent for data who can improve, optimize, and lead the further development of our data aggregation processes.
Requirements [must haves]
- BS degree in Computer Science or related technical field, or equivalent practical experience.
- Strong analytical skills for working with unstructured datasets.
- Solid understanding and working knowledge of relational and non-relational databases.
- Proficiency in a major programming language (e.g., Java or C) and/or a scripting language (e.g., Scala, PHP, or Python).
- Innately curious and organized, with the drive to analyze data to identify deliverables, anomalies, and gaps, and to propose solutions that address these findings.
- Experience with Amazon Web Services and/or Google Cloud Platform.
Nice to haves
- Implementation experience with data gathering, data pipelining, data standardization, data cleansing, and data stitching.
- Experience with tools such as Apache Beam, Apache Flink, Dataflow, and Apache Kafka.
Responsibilities
- Work with stakeholders, including the Analytics, Product, and Design teams, to assist with data-related technical issues and support their data infrastructure needs.
- Engineer solutions for large-scale data storage, management, and curation of model training data.
- Explore available technologies and design solutions that continuously improve our data quality, workflow reliability, and scalability, and report on performance and capabilities.
- Act as the internal expert on each of our data sources and own overall data quality.
- Design, build, and deploy new data models and ETL pipelines into production and the data warehouse.
- Define and manage the overall schedule and availability of all data sets.