
ETL Architect with GCP
GovServicesHub, San Jose, CA



Job description

Role: ETL Architect with GCP
Location: San Jose, CA
Experience: 15+ years of data engineering experience.

5+ years of experience with cloud platform services (preferably GCP). Must have: ex-Google candidates.

Responsibilities

Design and develop standard, reusable ETL jobs and pipelines.

Work with the team to extract data from different sources such as Oracle, cloud storage, and flat files.

Work with database objects including tables, views, indexes, schemas, stored procedures, functions, and triggers.

Work with the team to troubleshoot and resolve issues in job logic as well as performance.

Write ETL validations based on design specifications for unit testing.

Work with the BAs and DBAs on requirements gathering, analysis, testing, metrics, and project coordination.

Requirements

Good knowledge and understanding of cloud-based ETL frameworks and tools.

Good understanding and working knowledge of batch and streaming data processing.

Good understanding of the Data Warehousing architecture.

Knowledge of open table and file formats (e.g. Delta, Hudi, Iceberg, Avro, Parquet, JSON, CSV).

Strong analytical skills for working with unstructured datasets.

Excellent numerical and analytical skills.

Technical Skills

Programming & Languages: Java
Database Tech: Oracle, Spanner, BigQuery, Cloud Storage
Operating Systems: Linux

Hands-on experience in building and optimizing data pipelines and data sets.

Hands-on experience in migrating integrations from one server to another.

Hands-on experience in DevOps implementation.

Strong testing and debugging skills: writing unit tests and familiarity with the tools and techniques used to fix issues.

DevOps knowledge - CI/CD practices and tools.

Hands-on CI/CD experience in automating the build, test, and deployment processes to ensure rapid and reliable delivery of API updates.

Hands-on experience with data extraction and transformation tasks while addressing data security, error handling, and pipeline performance.

Hands-on experience with relational SQL databases (Oracle, SQL Server, or MySQL) and NoSQL databases.

Advanced SQL experience: creating and debugging stored procedures, functions, triggers, and object types in PL/SQL.

Hands-on experience with programming languages - Java (mandatory), Go, Python.

Hands-on experience in unit testing data pipelines.
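To illustrate the kind of unit testing of data pipelines the requirement above refers to, here is a minimal sketch. The role mandates Java, but the same pattern applies in any language; the `clean_record` transform and its field names are hypothetical, not taken from the posting.

```python
# Illustrative sketch: unit-testing a single ETL transformation step.
# The transform and field names are hypothetical examples.
import unittest


def clean_record(record: dict) -> dict:
    """Trim whitespace, normalize the email field, and reject records with no id."""
    if record.get("id") is None:
        raise ValueError("record is missing required 'id' field")
    return {
        "id": record["id"],
        "name": record.get("name", "").strip(),
        "email": record.get("email", "").strip().lower(),
    }


class CleanRecordTest(unittest.TestCase):
    def test_normalizes_fields(self):
        out = clean_record({"id": 1, "name": "  Ada ", "email": " ADA@X.COM "})
        self.assertEqual(out, {"id": 1, "name": "Ada", "email": "ada@x.com"})

    def test_rejects_missing_id(self):
        with self.assertRaises(ValueError):
            clean_record({"name": "no id"})


if __name__ == "__main__":
    unittest.main()
```

Validations like these are typically run per transformation step, so a failure points at the exact stage of the pipeline that broke.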

Experience in using Pentaho Data Integration (Kettle/Spoon) and debugging issues.

Experience supporting and working with cross-functional teams in a dynamic environment.





