As the Global Data Architect, you'll help us unite our product teams around a common set of RGAX goals and support their alignment with RGA's policies and procedures. The focus will include the architecture of the big data and machine learning infrastructure and services that power global capabilities across geographically dispersed teams and product entities. This includes fostering communication and cooperation across dispersed product teams, establishing and owning a global data architecture strategy, and providing data architecture leadership while maintaining information security and compliance standards across geographical regions.
- Participate in defining the RGAX Global strategy and vision, enabling expanded internal and commercial value through data across entity and product lines. Maintain a flexible architecture and governance model as pillars of RGAX strategy development, in close collaboration with RGA.
- Create, maintain, and manage the RGAX Global organization's data architecture models, spanning from a conceptual, holistic vision of the company's cross-product architecture (including big data architecture) through logical and/or physical models where applicable.
- Assess the use of internal and acquired data and how it relates to business value and operations; predict how changes will affect data use, and adapt the data architecture to compensate.
- Act as a data and big data architecture consultant to RGAX Global product teams.
- Work with teams to design and build large-scale data processing pipelines, analytics sandboxes, and at-scale production delivery platforms.
- Support data pipeline infrastructure through effective utilization of AWS, GCP and other cloud services.
- Work with business stakeholders to identify business opportunities that may result from and/or benefit from the application of data and advanced analytics, including machine learning. Clearly define the hypothesis, gate criteria, and success criteria to deliver the environments necessary to realize value from those opportunities.
- Aid in technology selection as the business defines new features requiring expanded system capability.
- Support the design and continual enhancement of the RGAX Global set of shared data capabilities to accelerate product teams, aligning with RGAX business needs.
- Participate in RGAX M&A due diligence and integration strategy planning as related to acquisition of new data sources.
- Use creative and results-driven problem solving to lead engineering efforts for product teams.
- Maintain regular and predictable attendance.
- Perform other duties as assigned.
Education and Experience
- Bachelor's degree in Computer Science or Information Systems, or equivalent related work experience
- 5+ years of experience in infrastructure engineering-related roles
- 2+ years of hands-on experience with big data and data-at-scale platform services
- 1+ year of experience with AWS or GCP services and infrastructure supporting work in the big data and application development space.
- Experience working with data in columnar storage formats (including Parquet and ORC) and serialization formats (such as Avro).
- Experience working with AWS or other cloud-platform big data tools such as Elastic MapReduce (EMR), S3, and Redshift
- Experience with or understanding of container technologies, including Docker and Kubernetes.
- Experience working with AWS Lambda or other serverless approaches for data processing pipelines.
- Experience using Python or Scala for big data processing with Apache Spark or Hadoop.
- Experience in infrastructure automation on AWS with Terraform or similar approaches
- Experience with Node.js or similar frameworks
- Experience working with bash scripting and the AWS CLI in Linux-based systems
- Experience with securely storing and transferring large data assets
- Experience deploying platform monitoring and performance tracking services
Skills and Abilities
- Database architecture, data structures, Hadoop, Cassandra, Spark, MapReduce, Storm, Kafka and/or other big data technologies
- Experience in building large-scale data processing pipelines using technologies like Apache Spark, Hadoop, and Hive for both structured and unstructured data assets
- In-depth understanding of data management (permissions, recovery, security and monitoring)
- Familiarity with cloud services (AWS, Azure, Google Cloud)
- Solid knowledge and understanding of SQL and NoSQL data stores.
- Experience designing, developing, testing, and deploying in AWS or a comparable cloud architecture.
- Experience in software development and in supporting the delivery of advanced data and analytics projects is essential.
- Comprehensive grasp of data visualization methods and data modeling
- Experience with one or more tools such as Tableau, SSRS, Power BI, Qlik, and/or similar programs
- Experience with data security by design and a background of working with sensitive data (e.g., PII or PHI) subject to regulations such as HIPAA and GDPR
- Strong communication skills, including ability to write industry white papers, publish content and speak at industry events.
- Comfortable voicing their own opinion and representing the viewpoints of others in a productive manner.
- Ability to deal with ambiguity and rapid change in a proactive and positive manner.
- Some experience with agile methodologies.