About the job
As a Data Architect, you use modern data engineering techniques and analytics to help clients turn information into insight. You collect, aggregate, store and reconcile data from multiple sources, and help design and build data pipelines, streams, reporting tools, data generators and other tools that deliver information and insight. You develop the predictive analytics that help clients succeed.
Day to day, you will:
- Share your knowledge and give solution support on proposals, supporting sales teams
- Manage the requirements gathering, analysis and design phases of projects
- Analyze large and complex data, using quantitative and qualitative methods
- Deliver data to users through common desktop applications
- Share your knowledge of best practices in analytics platforms, business intelligence, data management and Big Data
- Produce reports that show key performance indicators (KPIs), improvement opportunities, and problem root causes
- Identify new analytics tools and techniques
- Create and maintain report forms and formats, information dashboards, data generators, canned reports and other information portals and resources.
You understand how to turn piles of data into actionable insights that our clients can use to their competitive advantage. You can explain complex analytical concepts in non-technical language that business people can understand.
Your skills and experience include:
- Working with analytical models or techniques and knowing how to apply them in real-life business applications
- 5+ years' experience architecting and delivering Hadoop solutions
- 10+ years' experience architecting and delivering BI/EDW projects
- A technical degree in computer science or engineering
- Familiarity with Windows and Linux operating systems
- Previous experience working in a client-facing role
- Experience with Databricks, on Microsoft Azure or standalone
- Experience with Hadoop distributions such as Hortonworks and Cloudera, as well as their associated open-source projects (MapReduce, Impala, etc.)
- Specialized experience with NiFi, Kafka, Spark, and HBase, including Scala-based coding in Spark Streaming and Spark SQL
- Experience with the DevOps continuous integration lifecycle
- Experience working with statistical and data visualization tools and techniques
- Structured problem-solving
- Project and people management skills
- Advanced knowledge of Microsoft SharePoint, PowerPivot, SQL Server Reporting Services and Excel.
You probably have a Bachelor’s degree in Computer Science, Software Engineering or a related field. You may also have a Master’s degree.
You’ve gained at least five years of experience in quantitative analysis in a business setting, including building statistical models. You’ve spent around three years in a management role and at least as long in consulting.