felixchen1314: Benefits: 3 days WFO and 2 days WFH per week, plus 20 days of annual leave.
The JD is as follows:
Job Summary
The Principal Data Framework Engineer develops reusable frameworks and tools that enable teams to build pipelines and workflows efficiently. This role focuses on designing metadata-driven solutions for ETL, workflows, and data specifications, optimizing data integration processes, and ensuring seamless interaction with the platform’s core components. The Framework Engineer also implements robust data quality and integrity checks, enabling a scalable and flexible framework for data engineering tasks across the organization.
Strategy
• Framework Development: Design and implement flexible, metadata-driven frameworks for ETL, workflow orchestration, and data specifications to support efficient pipeline creation.
• Standardization: Establish best practices and reusable components for data ingestion, transformation, and processing to streamline efforts across teams.
• Metadata-Driven Design: Build frameworks leveraging metadata to define and automate ETL processes, workflows, and data specifications, ensuring flexibility and scalability (a minimal sketch of this pattern follows the list).
• Data Quality Checks: Develop and integrate automated data quality checks into frameworks using tools like Great Expectations or similar solutions.
• Data Integrity Checks: Ensure frameworks include mechanisms to validate data integrity across pipelines, capturing schema changes, record counts, and column-level validations (see the integrity-check sketch after this list).
• Integration: Build seamless integrations with core platform components like Apache Iceberg, Spark, Kafka, Airflow, and Argo to enable consistent processing across batch and streaming workloads.
• Workflow Orchestration: Develop metadata-driven workflow orchestration frameworks to enable easy configuration and deployment of ETL processes (see the Airflow sketch after this list).
• Reliability: Build frameworks with fail-safe mechanisms to handle errors and ensure reliability in pipeline execution.
• Compliance: Ensure frameworks adhere to governance policies and regulatory requirements.
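For readers unfamiliar with the "metadata-driven" pattern this JD keeps invoking, here is a minimal Python sketch, not from the JD itself: all names (`pipeline_spec`, `run_pipeline`) are hypothetical, and pandas stands in for the real processing engine. The idea is that a pipeline is declared as data and a small generic engine interprets it, so adding a new pipeline means writing configuration rather than code.

```python
import pandas as pd

# Hypothetical "metadata": a declarative spec for one pipeline. In a real
# framework this would live in a metadata store or a YAML file.
pipeline_spec = {
    "name": "daily_trades",
    "source": {"format": "csv", "path": "trades.csv"},
    "transforms": [
        {"op": "filter", "column": "status", "equals": "SETTLED"},
        {"op": "rename", "mapping": {"trade_dt": "trade_date"}},
    ],
    "sink": {"format": "parquet", "path": "trades_clean.parquet"},
}

def run_pipeline(spec: dict) -> None:
    """Generic engine: interprets the spec instead of hard-coding steps."""
    df = pd.read_csv(spec["source"]["path"])
    for step in spec["transforms"]:
        if step["op"] == "filter":
            df = df[df[step["column"]] == step["equals"]]
        elif step["op"] == "rename":
            df = df.rename(columns=step["mapping"])
    df.to_parquet(spec["sink"]["path"], index=False)

if __name__ == "__main__":
    run_pipeline(pipeline_spec)
```

In production the engine would dispatch to Spark or Kafka rather than pandas, but the division of labour is the same: specs change often, the engine rarely.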
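The integrity-check bullet is concrete enough to sketch without committing to a particular library (Great Expectations, named above, would be the production choice, but its API differs across versions, so a hand-rolled, library-free version is shown instead). All names and thresholds here are hypothetical:

```python
import pandas as pd

# Hypothetical contract for one dataset; a real framework would load this
# from the same metadata store that drives the pipelines.
EXPECTED_SCHEMA = {"trade_id": "int64", "amount": "float64", "trade_date": "object"}
MIN_ROWS = 1  # real pipelines would compare against historical counts

def check_integrity(df: pd.DataFrame) -> list[str]:
    """Return human-readable violations; an empty list means the data passed."""
    problems = []
    # Schema-change check: column names and dtypes must match the contract.
    actual = {col: str(dtype) for col, dtype in df.dtypes.items()}
    if actual != EXPECTED_SCHEMA:
        problems.append(f"schema drift: expected {EXPECTED_SCHEMA}, got {actual}")
    # Record-count check.
    if len(df) < MIN_ROWS:
        problems.append(f"record count {len(df)} below threshold {MIN_ROWS}")
    # Column-level check: the primary key must be non-null and unique.
    if df["trade_id"].isna().any() or df["trade_id"].duplicated().any():
        problems.append("trade_id contains nulls or duplicates")
    return problems
```

Wired into the framework, a non-empty return value would fail the run before bad data reaches downstream consumers, which is what "fail-safe mechanisms" in the Reliability bullet amounts to in practice.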
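Likewise, "metadata-driven workflow orchestration" usually means generating orchestrator tasks from the same specs. A minimal Airflow sketch (2.4+ syntax; `PIPELINE_SPECS` and `run_pipeline` are hypothetical, while `DAG` and `PythonOperator` are real Airflow APIs):

```python
import pendulum
from airflow import DAG
from airflow.operators.python import PythonOperator

# Hypothetical metadata: in practice this would come from a metadata store.
PIPELINE_SPECS = [
    {"name": "daily_trades", "upstream": None},
    {"name": "daily_positions", "upstream": "daily_trades"},
]

def run_pipeline(name: str) -> None:
    print(f"running pipeline {name}")  # placeholder for the real engine call

with DAG(
    dag_id="metadata_driven_etl",
    start_date=pendulum.datetime(2024, 1, 1, tz="UTC"),
    schedule="@daily",
    catchup=False,
) as dag:
    tasks = {}
    # One task per spec entry: adding a pipeline is a metadata change, not code.
    for spec in PIPELINE_SPECS:
        tasks[spec["name"]] = PythonOperator(
            task_id=spec["name"],
            python_callable=run_pipeline,
            op_kwargs={"name": spec["name"]},
        )
    # Wire dependencies declared in the metadata.
    for spec in PIPELINE_SPECS:
        if spec["upstream"]:
            tasks[spec["upstream"]] >> tasks[spec["name"]]
```

Argo Workflows, the other orchestrator the JD names, would follow the same pattern with templated YAML instead of Python.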
Key Responsibilities
Business
• Enablement: Empower data engineers and analysts with user-friendly, metadata-driven frameworks to accelerate development and ensure consistency.
• Stakeholder Alignment: Collaborate with domain and platform teams to ensure frameworks address key business and technical requirements.
People & Talent
• Collaboration: Work closely with platform and data engineers to ensure frameworks align with infrastructure capabilities and operational requirements.
• Skill Sharing: Provide technical documentation and training to engineers and analysts using the frameworks.
Governance
• Compliance & Regulatory Adherence: Ensure compliance with internal and external regulatory requirements, including data sovereignty, privacy, and resilience standards.
• Data Governance & Quality: Implement data governance policies to maintain data quality, integrity, accuracy, and consistency across the platform, and deploy monitoring tools to proactively address data quality issues.
Regulatory & Business Conduct
• Display exemplary conduct and live by the Group’s Values and Code of Conduct.
• Take personal responsibility for embedding the highest standards of ethics, including regulatory and business conduct, across Standard Chartered Bank. This includes understanding and ensuring compliance with, in letter and spirit, all applicable laws, regulations, guidelines and the Group Code of Conduct.
• Effectively and collaboratively identify, escalate, mitigate and resolve risk, conduct and compliance matters.
Skills and Experience
Key stakeholders
• Rest of Data & Analytics team
• WRB/CIB/GF Architecture teams
• Business and Functions Data & Analytics Team
Other Responsibilities
• Embrace and practise SCB’s brand promise of Here for Good and its corporate values
• Perform other responsibilities assigned under Group, Country, Business or Functional policies and procedures
• Responsible for building a culture of good conduct
Qualifications
• Bachelor’s or Master’s degree in Computer Science, Data Engineering, or related field.
• Expertise in Python and Java, with a focus on building reusable frameworks.
• Experience with metadata-driven framework design for ETL, workflow orchestration, and data specifications.
• Proficiency in data quality and integrity tools like Great Expectations or similar solutions.
• Hands-on experience with Apache Spark, Kafka, and orchestration tools like Airflow or Argo.
• Familiarity with governance tools like Apache Ranger, OpenLineage, and DataHub.
Required skills
• Framework Development
• Database Development
• Data Quality
• Data Security
• In-Fusion
• Process Integration
• System Reliability
• Legal Compliance