POSITION SUMMARY
As a Data Quality Analyst, you will play a critical role in ensuring the reliability, accuracy, and consistency of enterprise data by building and maintaining automated data quality checks, monitoring data pipelines, identifying anomalies, and resolving data issues at the source.
This role is essential to support trusted analytics and downstream decision-making across the organization.
RESPONSIBILITIES
- Develop and Maintain Data Quality Frameworks: Design and implement automated data validation checks, profiling rules, and anomaly detection processes across datasets from multiple systems and sources.
- Perform Data Quality Audits and Root Cause Analysis: Proactively identify and resolve data inconsistencies, duplication, data drift, and schema issues.
Perform root-cause analysis to trace data quality issues back to source systems or transformation processes.
- Enhance Data Pipelines and ETL Processes: Work closely with engineering and data teams to ensure data ingestion and transformation processes maintain integrity, completeness, and timeliness standards.
Support testing and validation for pipeline changes.
- Collaborate with Stakeholders: Partner with business, IT, and analytics teams to understand critical data quality requirements and to prioritize resolution of data defects that impact key reporting and operational systems.
- Document Data Definitions and Quality Rules: Maintain metadata, data dictionaries, and technical documentation on data lineage, validation logic, and quality scoring.
- Establish Monitoring and Alerting: Implement proactive data quality monitoring dashboards, metrics, and alerting systems to surface real-time issues with production data.
- Support Continuous Improvement and Governance Initiatives: Lead or participate in data stewardship, master data governance, and quality improvement initiatives under Global Business Services programs.
- Tools & Testing: Leverage SQL, Python, or other tools for custom rule building, testing, and diagnostics.
Build reusable quality test suites and participate in test automation efforts; see the illustrative sketch after this list.
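The following is a minimal sketch of the kind of reusable data quality check this role involves, assuming a Python/SQL toolchain; the table, column, and function names are hypothetical and are not drawn from any Vertiv system.

import sqlite3
from dataclasses import dataclass

@dataclass
class CheckResult:
    check_name: str
    passed: bool
    detail: str

def null_rate_check(conn, table, column, max_null_rate=0.01):
    # Fail the check if the share of NULLs in `column` exceeds the threshold.
    total, nulls = conn.execute(
        f"SELECT COUNT(*), SUM(CASE WHEN {column} IS NULL THEN 1 ELSE 0 END) "
        f"FROM {table}"
    ).fetchone()
    rate = (nulls or 0) / total if total else 0.0
    return CheckResult(
        check_name=f"null_rate:{table}.{column}",
        passed=rate <= max_null_rate,
        detail=f"null rate {rate:.2%} (threshold {max_null_rate:.2%})",
    )

if __name__ == "__main__":
    # Hypothetical in-memory dataset standing in for a production table.
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE orders (id INTEGER, customer_id INTEGER)")
    conn.executemany("INSERT INTO orders VALUES (?, ?)", [(1, 10), (2, None), (3, 11)])
    print(null_rate_check(conn, "orders", "customer_id", max_null_rate=0.05))

In practice, checks like this would be parameterized per dataset and wired into the monitoring dashboards and alerting described above.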
QUALIFICATIONS
- Bachelor’s degree in Information Technology, Computer Science, Mathematics, Engineering, or related field
- 1-3 years of experience in a data quality, data engineering, or related role within a multinational environment
- High proficiency in SQL (MSSQL, Oracle, Impala, etc.) and related procedural extensions (e.g., T-SQL, PL/SQL)
- Working knowledge of scripting/programming languages (Python preferred) for building custom validation rules or quality automation
- Familiarity with ETL tools and concepts
- Knowledge of data governance, master data management (MDM), and metadata best practices
- Detail-oriented mindset with a passion for ensuring data accuracy and completeness
- Strong communication and collaboration skills with the ability to work cross-functionally
- Strong documentation and reporting skills to communicate findings and improvement opportunities
The successful candidate will embrace Vertiv’s Core Principles & Behaviors to help execute our Strategic Priorities.
OUR CORE PRINCIPLES: Safety, Integrity, Respect, Teamwork, Diversity & Inclusion.
OUR STRATEGIC PRIORITIES
Customer Focus
Operational Excellence
High-Performance Culture
Innovation
Financial Strength
OUR BEHAVIORS
Own It
Act With Urgency
Foster a Customer-First Mindset
Think Big and Execute
Lead by Example
Drive Continuous Improvement
Learn and Seek Out Development
At Vertiv, we offer the stability of a global leader in a growing industry and the opportunity of a startup.
We design, manufacture and service the mission-critical infrastructure technologies for vital applications in data centers, communication networks and commercial and industrial environments.
With $5 billion in sales, a strong customer base and global reach in nearly 70 countries, our move to establish a standalone business enables us to deliver greater value to our customers and create new opportunities for our people.