Experience: 10 years
• Analyze database structures and models to identify data integrity and performance issues.
• Design, implement, and tune tables, queries, stored procedures, and indexes in OLTP and OLAP environments.
• Develop and maintain enterprise-wide (domain) data models.
• Serve as a data resource for the organization.
• Gather and analyze data supporting business cases, projects, and systems requirements.
• Design and develop logical and physical database models.
• Design models for Self-Service Analytics and Reporting.
• Collaborate with both internal and external teams to identify and validate data structures and fields, and to define build requests for new reports, dashboards, and ML models.
• Prepare report datasets for programs, management, and other requirements on time.
• Support the preparation and evaluation of special data projects.
• Utilize ETL (Extract, Transform, Load) methods to bring disparate data sources together into the format required by applications.
• Review data for quality assurance discrepancies and communicate issues to supervisor and affected staff as needed.
• Provide support for reporting issues or failures at any time.
• Assemble large, complex data sets that meet functional and technical requirements.
• Work with stakeholders, including the business analytics teams and application architecture teams, to assist with data-related technical issues and support their data infrastructure needs.
• Work with geographically dispersed teams, embracing Agile and DevOps strategies and driving their adoption to enable greater technology and business value.
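The ETL responsibility above can be sketched in a few lines of Python. This is a minimal illustration, not the actual pipeline: the two source feeds, their field names, and the SQLite in-memory target are all hypothetical stand-ins for real disparate sources and the application schema.

```python
import sqlite3

# Hypothetical raw rows from two disparate sources (names and fields
# are illustrative only).
source_a = [{"id": 1, "amount": "12.50"}, {"id": 2, "amount": "7.00"}]
source_b = [{"id": 3, "amt_cents": 301}]

def extract():
    """Pull rows from each source as-is."""
    return list(source_a), list(source_b)

def transform(rows_a, rows_b):
    """Normalize both feeds into one format: integer id plus amount in cents."""
    out = [{"id": r["id"], "cents": int(float(r["amount"]) * 100)} for r in rows_a]
    out += [{"id": r["id"], "cents": r["amt_cents"]} for r in rows_b]
    return out

def load(rows, conn):
    """Load the unified rows into the target table."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS ledger (id INTEGER PRIMARY KEY, cents INTEGER)"
    )
    conn.executemany("INSERT INTO ledger (id, cents) VALUES (:id, :cents)", rows)
    conn.commit()

conn = sqlite3.connect(":memory:")
load(transform(*extract()), conn)
total = conn.execute("SELECT SUM(cents) FROM ledger").fetchone()[0]
```

The key design point is that transform owns all format reconciliation, so load stays a dumb, idempotent write step that any source can feed.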
• Advanced SQL knowledge and experience working with relational databases, including query authoring and complex stored procedures for data processing.
• Advanced data modeling skills for both OLAP and OLTP databases.
• Experience building and optimizing data pipelines, architectures and datasets.
• Experience performing root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement.
• Experience performing SQL performance tuning and optimization.
• Experience supporting and working with cross-functional teams in a dynamic environment.
• Knowledge of SQL, PL/pgSQL, including stored procedures, functions, triggers, and views
• Knowledge of JSON and PostgreSQL support for JSON
• Ability to efficiently write database code without compromising data quality, privacy or security
• Knowledge of database design principles, query optimization, index management, integrity checks, statistics and isolation levels
• Familiarity with shell script, Python
• Able to handle multiple tasks in a fast-paced environment.
• Excellent verbal, written, and interpersonal communication skills.
• Ability to evaluate alternative solutions and/or workarounds.
• Desire to learn and adopt new tools and technologies as required by the project.
• 5+ years’ experience building large-scale operational data stores in DB2, Postgres, Oracle, MS SQL Server, or an equivalent relational database.
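The query optimization and index management skills listed above can be illustrated with a short sketch. It uses Python's stdlib sqlite3 as a stand-in (in Postgres the equivalent check would be EXPLAIN/EXPLAIN ANALYZE); the table and index names are hypothetical.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, total REAL)"
)
conn.executemany(
    "INSERT INTO orders (customer_id, total) VALUES (?, ?)",
    [(i % 100, float(i)) for i in range(1000)],
)

query = "SELECT * FROM orders WHERE customer_id = 42"

# Before indexing: the planner has no choice but a full table scan.
plan_before = conn.execute("EXPLAIN QUERY PLAN " + query).fetchall()

# Add an index on the filtered column, then re-check the plan.
conn.execute("CREATE INDEX idx_orders_customer ON orders (customer_id)")
plan_after = conn.execute("EXPLAIN QUERY PLAN " + query).fetchall()

# The last column of each plan row is the human-readable detail string.
before_txt = " ".join(row[-1] for row in plan_before)
after_txt = " ".join(row[-1] for row in plan_after)
```

Reading the plan before and after the change is the core habit: an index only counts as tuning if the planner actually uses it for the target query.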
Candidates should also have 10+ years of experience using the following software or tools:
• Experience with IBM DB2, Postgres, AWS Aurora, or Snowflake is a plus.
• Experience with relational SQL and NoSQL databases, including Oracle, MongoDB, and Azure SQL.
• Experience with AWS cloud services
• Experience with stream-processing systems (Event Hubs, Kafka, Spark Streaming) is a plus.
• Experience with Python is a plus.
• Experience with Big Data warehouses (MPP) is a plus.
• 7+ years of hands-on experience in data modeling, model-driven engineering, and design patterns.
• 5+ years of hands-on experience with data lakes, data warehousing, messaging, and distributed data architectures, and in establishing data platforms to support complex analytical usage, including operational ML use cases.
• Implement and maintain database code in the form of stored procedures, scripts, queries, views, triggers, etc.
• Work with front-end developers to define simple yet powerful APIs.
• Work to ensure efficiency of database code, integrity of data structures and quality of data content.
• Work with product managers to ensure database code meets requirements.
• Work to ensure database code is accurately documented.
• Work to maintain the PostGraphile environment.
• Participate as a member of Agile teams; review user stories with other team members, estimate the effort to build functionality supporting those stories, and participate in sprint reviews.
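The "stored procedures, scripts, queries, views, triggers" duty above can be sketched end to end. This uses stdlib sqlite3 purely as an executable stand-in (in Postgres the trigger body would be a PL/pgSQL function), and every table, trigger, and view name here is hypothetical.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE accounts (id INTEGER PRIMARY KEY, balance INTEGER NOT NULL);

-- Audit table populated automatically by the trigger below.
CREATE TABLE accounts_audit (account_id INTEGER, old_balance INTEGER, new_balance INTEGER);

-- Trigger: record every balance change (in Postgres this would be a
-- PL/pgSQL trigger function attached with CREATE TRIGGER).
CREATE TRIGGER trg_accounts_audit AFTER UPDATE OF balance ON accounts
BEGIN
  INSERT INTO accounts_audit VALUES (OLD.id, OLD.balance, NEW.balance);
END;

-- View: a reporting-friendly projection over the base table.
CREATE VIEW positive_accounts AS
  SELECT id, balance FROM accounts WHERE balance > 0;
""")

conn.execute("INSERT INTO accounts VALUES (1, 100), (2, -5)")
conn.execute("UPDATE accounts SET balance = 150 WHERE id = 1")

audit_rows = conn.execute("SELECT * FROM accounts_audit").fetchall()
visible = conn.execute("SELECT id FROM positive_accounts").fetchall()
```

Keeping integrity logic (the trigger) and presentation logic (the view) inside the database, rather than in application code, is what makes "integrity of data structures and quality of data content" enforceable in one place.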