Oracle Developer


Portland, OR

Industry: IT Consulting/Services


5 - 7 years

Posted 299 days ago

  by Cynet Systems


Job Description:

  • A strong skill set in complex database projects with performance tuning is a must.
  • Design, develop, troubleshoot, and debug software programs for databases, applications, tools, networks, etc.
  • Strong SQL and data analysis skills (including investigative troubleshooting skills)
  • Strong data testing & validation experience
  • Strong problem solving and troubleshooting skills
  • Ability to work well and communicate clearly with technical team members
  • Demonstrated accountability and independence in work

Educational, Certifications and/or Other Professional Credentials:

  • A Bachelor's degree in Computer Science, Engineering, or equivalent experience is preferred, with five years of related experience.
  • Experience with Oracle's core products, applications, and tools is important.

Required Skills:

  • A strong skill set in complex database projects with performance tuning is a must.
  • Database administration
  • Table design including normalization
  • Database backup and recovery experience is a plus but not required
  • Understanding how databases work with cloud applications is also a plus
  • Reviewing query performance and optimizing code
  • Writing queries used for front-end applications (websites, desktop applications, or cloud apps)
  • Designing and coding database tables to store the application's data
  • Data modeling to visualize database structure
  • Working with application developers to create optimized queries
  • Creating database triggers for automation, e.g., automatic email notifications
  • Creating table indexes to improve database performance
  • Programming views, stored procedures, and functions
  • Expert knowledge of Oracle Database, including Oracle 10g/11g core products, applications, and tools, is important.
  • Write and tune SQL, including database queries, DDL and DML, stored procedures, triggers, user-defined functions, analytic functions, etc.
  • Create code that meets design specifications, follows standards, and is easy to maintain
  • Own features that you develop end to end. Work with end users on requirements gathering, develop and test your code, implement new processes in production, then maintain and support them over time
  • Drive our data platform and help evolve our technology stack and development best practices
  • Develop and unit test assigned features to meet product requirements
  • Experience implementing high availability features of Oracle Database
  • Ability to identify proactive services that benefit the customer
  • Ability to facilitate issue resolution with Development and Support
  • Strong analytical skills
  • Critical attention to detail, with strong written and oral communication skills
  • Prior experience working in a SaaS business model and/or complex data environments within the Insurance industry preferred
  • API development experience is a plus
  • Working with the Agile/Scrum development process is a plus
  • Expertise in building and maintaining scalable and reliable ETL jobs
  • Expertise in data profiling (i.e. source data analysis)
  • Ability to work with various data infrastructure technologies, including:
  • SQL relational databases
  • NoSQL data stores (e.g. Redis/Cassandra)
  • Distributed processing engines (e.g. Spark)
  • Data modeling skills, both relational and non-relational

Key responsibilities:

  • The Candidate will parse large volumes of Retirement Planning data into formats amenable to scientific analysis. For scale, data sets contain 10-20 million participants and ~50 million participant-years of 401k Plan, procedure, and payment data.
  • The Candidate will work closely with our Engineers to understand participant data sets and validate data integrity
  • The Candidate will create highly optimized, scalable, redundant, and distributed software.
  • The Candidate will extend and leverage our current Java based ETL platform
  • The Candidate will work with our Data Engineers to specify our next-generation ETL platform
  • The Candidate will optimize performance of data import and analysis over large data sets