Job Summary
The Data Quality Analyst will partner with TCW’s Data Engineering team to enhance data integrity across critical enterprise platforms. The role focuses on designing, implementing, and monitoring data quality rules, primarily leveraging Ataccama DQ, to ensure that TCW’s data assets meet standards for accuracy, completeness, consistency, and timeliness. The analyst will also collaborate with data engineers, business analysts, and governance teams to remediate data issues and embed quality checks into data pipelines.
Key Responsibilities
- Design, build, and maintain data quality rules within Ataccama DQ (required).
- Translate business data quality requirements into technical rule logic.
- Parameterize rules to support scalability and reusability (an illustrative sketch follows this list).
- Perform profiling on datasets to detect anomalies, missing values, duplicates, and inconsistencies.
- Generate reports highlighting quality gaps and recommend remediation actions.
- Set up dashboards and alerts to track rule performance and compliance over time.
- Collaborate with Data Engineering to embed rules in ETL/ELT pipelines.
- Investigate failed data quality checks and partner with upstream/downstream teams to resolve issues.
- Document data defects, root causes, and resolution paths.
- Work with Data Governance and Business SMEs to define data standards and stewardship processes.
- Contribute to metadata documentation and lineage tracking.
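To make the rule-design and profiling responsibilities above more concrete, the sketch below shows what a parameterized completeness and uniqueness check might look like. It is written in Python with pandas purely for illustration; it is not TCW's or Ataccama's actual implementation, and the dataset, column names, and thresholds are hypothetical.

```python
import pandas as pd


def run_dq_checks(df: pd.DataFrame, required_cols: list[str],
                  key_cols: list[str],
                  completeness_threshold: float = 0.99) -> pd.DataFrame:
    """Run simple, parameterized data quality checks and return a result table.

    required_cols: columns that must be populated (completeness rule)
    key_cols: columns whose combination must be unique (duplicate rule)
    completeness_threshold: minimum share of non-null values needed to pass
    """
    results = []

    # Completeness: share of non-null values per required column
    for col in required_cols:
        ratio = df[col].notna().mean()
        results.append({
            "rule": f"completeness_{col}",
            "metric": round(ratio, 4),
            "passed": ratio >= completeness_threshold,
        })

    # Uniqueness: duplicate rows on the business key
    dup_count = int(df.duplicated(subset=key_cols).sum())
    results.append({
        "rule": "uniqueness_" + "_".join(key_cols),
        "metric": dup_count,
        "passed": dup_count == 0,
    })

    return pd.DataFrame(results)


if __name__ == "__main__":
    # Hypothetical client dataset, used only to demonstrate the rule parameters
    clients = pd.DataFrame({
        "client_id": [1, 2, 2, 4],
        "client_name": ["Acme", "Beta", "Beta", None],
        "country": ["US", "GB", "GB", "US"],
    })
    report = run_dq_checks(
        clients,
        required_cols=["client_id", "client_name"],
        key_cols=["client_id"],
        completeness_threshold=0.95,
    )
    print(report)
```

In practice, equivalent logic would be expressed as Ataccama DQ rules and surfaced through monitoring dashboards and alerts; the Python version is only meant to show how parameterizing the checked columns and thresholds keeps a rule reusable across datasets.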
Required Skills & Experience
- 5+ years in data quality, data management, or data engineering roles.
- Strong experience with data quality tools (Ataccama required).
- Proficiency in SQL for rule writing, data profiling, and validation.
- Experience with ETL/ELT workflows and modern data platforms (Snowflake, Databricks, Azure preferred).
- Familiarity with data governance frameworks and best practices.
- Strong problem-solving, analytical, and communication skills.
Preferred Skills
- Prior experience in asset management or financial services data environments.
- Knowledge of reference data, market data, and client data domains.
- Experience with automation of DQ workflows using Python, Spark, or other scripting languages.
- Exposure to cloud-based architectures (Azure, AWS).
Deliverables / Success Metrics
- Implementation of baseline data quality rules across prioritized datasets (e.g., client, security master, transaction data).
- Monthly reporting of rule performance, exceptions, and remediation status.
- Documented library of data quality checks mapped to business requirements.
- Demonstrated reduction in recurring data quality issues and improved SLA adherence for downstream consumers.
Share candidate profiles with vishnu.gadila@cesltd.com.
