Be Part Of A High-Performing Global Security Data Operations Team:
This opportunity sits within a global financial institution’s Information Security Data Operations organization, supporting enterprise-wide data initiatives across a modern cloud-based Data Lakehouse platform. The team plays a critical role in centralizing and governing cybersecurity and information security data, ensuring accuracy, integrity, and reliability across complex ETL pipelines.
Operating within a highly regulated financial services environment, the team partners closely with IT, Data Engineering, and Chief Data & Analytics Office (CDAO) stakeholders across regions. The work directly supports enterprise risk management, cyber intelligence, and data governance initiatives. Collaboration across global teams—including Japan-based stakeholders—requires strong bilingual communication and the ability to operate effectively in cross-cultural technical environments.
What’s In Store For You:
- High-visibility role supporting enterprise cybersecurity data initiatives
- Opportunity to architect and build QA frameworks from the ground up on an Azure Data Lakehouse platform
- Hands-on ownership of scalable, cloud-based QA ETL validation pipelines
- Exposure to medallion architecture (raw, curated, and target layers)
- Strategic involvement in SDLC process improvements within a regulated banking environment
How You Will Make An Impact:
- Design and develop a comprehensive QA framework within Azure Databricks to validate ETL pipelines across the Data Lakehouse
- Build and maintain QA ingestion pipelines to test data feeds from the raw layer through the curated and target layers (medallion architecture)
- Develop automated data quality checks to detect defects in ETL code and data transformations
- Create scalable, flexible QA processes for large-volume cybersecurity datasets
- Optimize cloud-based data storage and validation mechanisms
- Develop Python- and SQL-based validation scripts for complex structured and semi-structured data
- Partner with QA analysts to validate code, data transformations, and integration points as part of the SDLC process
- Serve as a technical liaison between IT and CDAO Data Engineering teams
- Implement mechanisms to auto-generate JIRA tickets when defects or data inconsistencies are detected
- Perform root cause analysis on data discrepancies and coordinate remediation efforts
- Provide routine updates aligned with CyberDW sprint schedules and adhere to internal Agile/SDLC standards
- Deliver bilingual (Japanese/English) communication to global stakeholders regarding QA findings, remediation plans, and sprint progress
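To make the core responsibility concrete: the role centers on automated validation of data as it moves between layers. As a purely illustrative sketch (the actual platform is Azure Databricks; the table names, column names, and checks below are hypothetical placeholders, shown here with Python's built-in `sqlite3` for self-containment), a Python/SQL validation of this kind might reconcile row counts and key completeness between a raw and a curated table:

```python
import sqlite3

def validate_layers(conn, raw_table, curated_table, key_col):
    """Illustrative data-quality checks between two layers.

    Returns a list of defect descriptions (empty list = all checks pass).
    Table and column names are trusted placeholders, not user input.
    """
    defects = []
    cur = conn.cursor()

    # Check 1: row-count reconciliation -- the curated layer should not
    # silently drop records during transformation.
    raw_count = cur.execute(f"SELECT COUNT(*) FROM {raw_table}").fetchone()[0]
    curated_count = cur.execute(f"SELECT COUNT(*) FROM {curated_table}").fetchone()[0]
    if curated_count != raw_count:
        defects.append(f"row count mismatch: raw={raw_count}, curated={curated_count}")

    # Check 2: the key column must be fully populated after transformation.
    null_keys = cur.execute(
        f"SELECT COUNT(*) FROM {curated_table} WHERE {key_col} IS NULL"
    ).fetchone()[0]
    if null_keys:
        defects.append(f"{null_keys} NULL values in {curated_table}.{key_col}")

    return defects

# Demo with hypothetical tables: one record is lost in curation,
# so the row-count check reports a defect.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE raw_events (event_id TEXT, source TEXT);
    CREATE TABLE curated_events (event_id TEXT, source TEXT);
    INSERT INTO raw_events VALUES ('e1','fw'),('e2','ids');
    INSERT INTO curated_events VALUES ('e1','fw');
""")
print(validate_layers(conn, "raw_events", "curated_events", "event_id"))
# -> ['row count mismatch: raw=2, curated=1']
```

In practice each returned defect would feed the JIRA auto-ticketing mechanism described above rather than being printed.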
Are you an experienced QA Data Engineer ready to lead QA framework development in a cloud data environment?
- 10+ years of experience in software engineering, QA engineering, or data engineering roles
- Strong hands-on experience building QA frameworks for data platforms
- Advanced proficiency in Python and SQL for data validation and automation
- Experience with Azure Databricks and Azure Data Factory
- Experience with AWS Glue or similar cloud data processing tools (preferred but not required)
- Strong understanding of ETL processes and large-scale data ingestion pipelines
- Hands-on experience validating data within medallion architecture (raw, curated, target layers)
- Experience working in Agile SDLC environments
- Ability to design scalable and adaptable QA automation mechanisms
- Bachelor’s degree in Computer Science, Engineering, Information Technology, or related field
Language Requirements:
- Professional fluency in Japanese and English (verbal and written)
- Ability to clearly communicate technical findings and QA results to global stakeholders
- Experience working with cross-border teams, particularly Japan-based technology or business units, preferred
Desired Qualifications:
- Experience supporting cybersecurity or information security data domains
- Familiarity with enterprise data governance and regulatory environments
- Experience designing automated defect-tracking integrations with JIRA
- Certifications in cloud technologies (Azure, AWS) or QA methodologies