The Sr. Data Analyst is responsible for moderate to complex data profiling, analysis and mapping, including entity-relationship data designs that align with Unum’s Enterprise Data Management Strategy and Architecture.
This candidate works closely with Business Analysts, Data Analysts, Data Architects, the business units, and application development groups. They participate in the conceptualization, design, development, and delivery of solutions within assigned business portfolios, and may research third-party/cloud solutions to ensure successful integration of the technology. They will serve as subject matter experts in both their data domain and their technology skillset. The Senior Data Analyst plays a critical role in mentoring Data Analysts and Associate Data Analysts to ensure we are growing future talent within the agile teams.
Principal Duties and Responsibilities
• Conduct moderate to complex data profiling and analysis to evaluate data sources to determine the best source for business information.
• Complete source-to-target data mapping specifications (e.g. from one DBMS table to another DBMS table, or from a DBMS table into a canonical message structure).
• Partner with subject matter experts, architects and engineers to capture and analyze business needs to lead the creation of all data modeling related artifacts.
• Translate and transform business requirements into complex data models (conceptual and logical).
• Design moderately complex to complex, flexible data models (conceptual and logical) through collaboration with analysts, engineers, and the Physical DBA.
• Collaborate with the Data Architect and Physical DBA to translate the logical data model into a physical database design.
• Create and maintain up-to-date documentation (metadata).
• Recommend architectural standard improvements.
• Work with integration teams to ensure that the model design and development are properly communicated.
• Contribute to the development of enterprise data management standards and procedures, guidelines, and best practices.
• Contribute to the development of data quality requirements, policies, data rules, reporting, and remediation.
• Contribute to the refinement and standardization of data modeling practices to align with application paradigm (e.g. Business Intelligence, Transactional Processing, Big Data).
• Collaborate with the test engineers to perform data validation and testing activities as appropriate.
• Adhere to approved architectural standards.
• Mentor core resources as designated by team leader.
• Think with the mind of the end customer at all times, ensuring solutions improve the customer experience and delight customers.
• Strong team player; able to work effectively within a team and more broadly with people from a variety of backgrounds and areas across the organization.
• Excellent communication, facilitation, and negotiation skills.
• Ability to work as part of a team and interact effectively with others.
• Ability to embrace change, adapt to the unexpected, and focus energies, people, and solutions on practical and positive results.
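As an illustration of the source-to-target mapping specifications described above, the sketch below expresses a mapping as data and applies it to a source record. All field names (BIRTH_DT, STAT_CD, birth_date, status) and transformation rules are hypothetical examples, not Unum specifics.

```python
from datetime import datetime, date

# Hypothetical source-to-target mapping specification: each entry names a
# source field, a target field, and the transformation rule between them.
MAPPING = [
    {"source": "BIRTH_DT", "target": "birth_date",
     # Rule: parse a YYYYMMDD string into a date value.
     "transform": lambda v: datetime.strptime(v, "%Y%m%d").date()},
    {"source": "STAT_CD", "target": "status",
     # Rule: decode a status code into a descriptive value.
     "transform": lambda v: {"A": "active", "T": "terminated"}[v]},
]

def apply_mapping(source_row: dict) -> dict:
    """Apply each mapping rule to a source row to produce a target record."""
    return {m["target"]: m["transform"](source_row[m["source"]])
            for m in MAPPING}

target = apply_mapping({"BIRTH_DT": "19800115", "STAT_CD": "A"})
print(target)  # {'birth_date': datetime.date(1980, 1, 15), 'status': 'active'}
```

In practice such specifications are usually captured in mapping documents or metadata tools rather than code; the point is that each mapping line pairs a source element, a target element, and an explicit transformation rule.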
Job Specifications (Required)
• Bachelor's degree preferred, or equivalent experience
• 6+ years of experience in Data Analysis or equivalent work experience
• Proficient in resolving data anomalies and conducting data integration analysis
• Proficient in the Data Management discipline
• Strong SQL skills and strong data analysis skills
• Ability to read XML
• Familiarity with both Relational Data and UML Modeling
• Familiarity with the concepts of Operational Data Store and Data Warehouse
• Proven data profiling experience (e.g. single-column, table-structural, cross-table)
• Proficient in developing data mapping specifications, including identification of data rules, defining transformation rules, etc.
• Ability to understand and apply the Software Development Life Cycle (SDLC)
• Data Management certification(s) (e.g. DAMA, IDMA, ICCP) is a plus
• Experience in the financial and insurance industry is a plus
• Experience with Agile Development Methodologies
• Knowledge of Big Data/NoSQL platforms (e.g. Hadoop, Cassandra)
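The single-column data profiling and SQL skills listed above can be sketched with a minimal example: profiling one column for row counts, null rate, cardinality, and value range. The table and column names (`policy`, `premium`) and the sample data are invented for illustration.

```python
import sqlite3

# Hypothetical sample table; in practice this would be a production source.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE policy (policy_id INTEGER, premium REAL)")
conn.executemany("INSERT INTO policy VALUES (?, ?)",
                 [(1, 120.0), (2, None), (3, 120.0), (4, 85.5)])

# Single-column profile of `premium`: total rows, non-null count,
# distinct values, and min/max range.
row = conn.execute("""
    SELECT COUNT(*)                AS total_rows,
           COUNT(premium)          AS non_null,
           COUNT(DISTINCT premium) AS distinct_vals,
           MIN(premium)            AS min_val,
           MAX(premium)            AS max_val
    FROM policy
""").fetchone()
print(row)  # (4, 3, 2, 85.5, 120.0)
```

Results like these (one null out of four rows, only two distinct values) are what feed the "best source for business information" evaluation: they expose completeness and cardinality issues before any mapping work begins.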