Data Engineer in Fort Lauderdale, FL

$80K - $100K (Ladders Estimates)

CBS Corporation • Fort Lauderdale, FL 33301

Industry: Media • Less than 5 years of experience

Posted 60 days ago

This job is no longer available.

DESCRIPTION:

Division Overview:

Our team is a diverse and agile group of engineers that runs data operations for the CNET Media Group Business Intelligence team. We are responsible for developing tagging, data pipelines, and data products to drive user growth, engagement, and revenue opportunities. This position will focus primarily on CBS Interactive's CNET Media Group properties, including CNET, GameSpot, TVGuide, ZDNet, and TechRepublic, among others.

Role Details:

As a Data Engineer, you'll develop workflows to ingest, store, and process data using Google Cloud Platform products and services, with an emphasis on scalability and reliability. You will work closely with Business Intelligence, Product, Revenue Optimization, and other Engineering teams to build and enhance our BigQuery data warehouse. This role provides the opportunity to work with the latest cloud and open-source technologies to develop and evolve our data analytics platform.
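
To give a flavor of this kind of pipeline work, the snippet below is a minimal sketch of batch-loading raw events from Cloud Storage into BigQuery with the google-cloud-bigquery Python client; the project, dataset, table, and bucket names are illustrative placeholders, not actual CNET Media Group resources.

    # Minimal illustrative sketch: batch-load newline-delimited JSON from GCS
    # into a BigQuery table. All resource names below are hypothetical.
    from google.cloud import bigquery

    client = bigquery.Client()  # uses application default credentials

    table_id = "my-project.analytics.page_views"                # hypothetical destination table
    uri = "gs://my-bucket/raw/page_views/2019-09-01/*.json"     # hypothetical GCS export path

    job_config = bigquery.LoadJobConfig(
        source_format=bigquery.SourceFormat.NEWLINE_DELIMITED_JSON,
        write_disposition=bigquery.WriteDisposition.WRITE_APPEND,
        autodetect=True,  # infer the schema from the source files
    )

    load_job = client.load_table_from_uri(uri, table_id, job_config=job_config)
    load_job.result()  # block until the load job completes

    table = client.get_table(table_id)
    print(f"Loaded table now has {table.num_rows} rows")

In practice, loads like this are parameterized and orchestrated by workflow tooling rather than run by hand, which is where the scheduling work described below comes in.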

Your Day-to-Day:

  • Collaborate with stakeholders to understand data needs and provide end-to-end data solutions
  • Develop and enhance ELT/ETL pipelines to ensure data availability and data quality (a minimal workflow sketch follows this list)
  • Create new data models to support data products and intuitive analytics
  • Use your Python and SQL coding skills to process and transform data
  • Research and promote data engineering best practices
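
As one illustration of the ELT/ETL pipeline work above, here is a minimal Airflow sketch (Airflow 1.10-era imports) that schedules a daily BigQuery rollup; the DAG id, query, and table names are hypothetical.

    # Minimal illustrative Airflow DAG: schedule a daily SQL rollup in BigQuery.
    # All names (DAG id, tables, project) are hypothetical placeholders.
    from datetime import datetime, timedelta

    from airflow import DAG
    from airflow.contrib.operators.bigquery_operator import BigQueryOperator

    default_args = {
        "owner": "data-eng",
        "retries": 2,
        "retry_delay": timedelta(minutes=5),
    }

    with DAG(
        dag_id="daily_page_view_rollup",
        default_args=default_args,
        start_date=datetime(2019, 9, 1),
        schedule_interval="@daily",
        catchup=False,
    ) as dag:
        rollup = BigQueryOperator(
            task_id="rollup_page_views",
            sql="""
                SELECT property, DATE(event_ts) AS day, COUNT(*) AS views
                FROM `my-project.analytics.page_views`
                WHERE DATE(event_ts) = '{{ ds }}'
                GROUP BY property, day
            """,
            destination_dataset_table="my-project.analytics.page_views_daily",
            write_disposition="WRITE_APPEND",
            use_legacy_sql=False,
        )

The same DAG pattern extends to ingestion from external APIs and to heavier transforms handed off to Dataflow or Spark, with Airflow handling retries, scheduling, and monitoring.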

Key Projects:

  • Migrate existing on-prem data pipelines to the cloud
  • Develop and enhance internal revenue data models and pipelines
  • Explore usage of new GCP data products and features (Data Catalog, BI Engine, BigQuery ML)

QUALIFICATIONS:

What you bring to the team:

You have -

  • Bachelor's degree in Computer Science or equivalent experience in a related field
  • 3+ years of hands-on experience working in a data warehousing or data engineering environment
  • Strong Python and SQL programming skills
  • 2+ years of experience developing data solutions on GCP or AWS
  • Strong experience authoring, scheduling, and monitoring workflows (Airflow, Luigi)
  • Experience ingesting data from external APIs and data stores
  • Experience designing, building, and operationalizing big data pipelines on distributed processing back ends (Cloud Dataflow, Spark, Flink)
  • Strong communication & interpersonal skills
  • A can-do attitude toward problem solving, quality, and execution

You might also have -

  • Experience implementing best practices for monitoring, alerting, and metadata management
  • Knowledge of Git, Jinja2, Docker, Bitbucket, and Bamboo
  • Google Cloud Certified - Professional Data Engineer certification would be a plus

EEO STATEMENT:

Equal Opportunity Employer Minorities/Women/Veterans/Disabled

Valid Through: 2019-09-16