
Manager Data Engineer (Scala/Spark + AWS)



Job description

**Company description**

Publicis Sapient is a digital transformation partner helping established organizations get to their future, digitally enabled state, both in the way they work and the way they serve their customers.

We help unlock value through a start-up mindset and modern methods, fusing strategy, consulting, and customer experience with agile engineering and problem-solving creativity.

United by our core values and our purpose of helping people thrive in the brave pursuit of next, our 20,000+ people in 53 offices around the world combine experience across technology, data sciences, consulting, and customer obsession to accelerate our clients’ businesses through designing the products and services their customers truly value.

**Overview**

We are looking for a **Manager Data Engineering (Scala/Spark + AWS + Kafka)** to lead a high-performing team in designing, building, and optimizing large-scale data solutions.

The ideal candidate will have a strong background in data engineering, hands-on recent experience with Scala, and the ability to guide teams in delivering scalable, reliable, and high-quality data pipelines on modern platforms.

You’ll be joining a **large-scale data modernization project within the hospitality industry**, focused on migrating a legacy reservation platform to a modern, cloud-native architecture.

This initiative is one of the most strategic and long-standing Data Engineering programs at Publicis Sapient, offering the opportunity to work on complex data challenges with global impact and cross-regional collaboration.

**Responsibilities**

**Your Impact**

+ Lead and inspire a team of data engineers, providing technical guidance, mentorship, and strategic direction to ensure the delivery of scalable, high-quality data solutions aligned with business goals.
+ Oversee the end-to-end design and implementation of data pipelines and platforms — from ingestion and transformation to modeling and analytics — ensuring reliability, performance, and maintainability.
+ Drive architecture and modernization initiatives, reverse-engineering legacy SQL and Scala code where necessary to establish best practices and technical excellence across the team.
+ Define data strategy and governance frameworks, implementing standards for data quality, lineage, and security across the organization.
+ Partner cross-functionally with software engineers, data scientists, architects, and business leaders to align on priorities, translate requirements into scalable solutions, and promote data-driven decision-making.
+ Foster a culture of continuous improvement, leveraging automation, observability, and performance monitoring to enhance data reliability and operational efficiency.
+ Identify and anticipate risks or bottlenecks, implementing proactive solutions that ensure project continuity and long-term platform scalability.
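
To make the pipeline work described above more concrete, here is a minimal Scala/Spark batch sketch, purely illustrative and not part of the role description: it reads raw data, applies a simple cleaning and aggregation step, and writes a curated output. The S3 paths, column names, and aggregation logic are hypothetical placeholders.

```scala
import org.apache.spark.sql.{SparkSession, functions => F}

object ReservationBatchSketch {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("reservation-batch-sketch")
      .getOrCreate()

    // Hypothetical S3 locations and column names, for illustration only.
    val raw = spark.read
      .option("multiLine", "true")
      .json("s3://example-bucket/raw/reservations/")

    // Basic cleaning and transformation: drop rows without an id,
    // normalize the check-in date, and aggregate bookings per property per day.
    val daily = raw
      .filter(F.col("reservation_id").isNotNull)
      .withColumn("check_in_date", F.to_date(F.col("check_in")))
      .groupBy("property_id", "check_in_date")
      .agg(F.count("*").as("bookings"))

    daily.write
      .mode("overwrite")
      .partitionBy("check_in_date")
      .parquet("s3://example-bucket/curated/daily_bookings/")

    spark.stop()
  }
}
```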

**Qualifications**

**Your Skills and Experience**

+ 8+ years of experience in Data Engineering, with proven expertise in designing and implementing large-scale data solutions.
+ Demonstrated experience as a Technical Lead, guiding teams of data engineers and ensuring delivery quality.
+ Strong exposure to streaming processing projects and experience with Kafka.
+ Recent hands-on experience with Scala and strong knowledge of Apache Spark.
+ Solid expertise in SQL (Postgres) and NoSQL (MongoDB/DocumentDB), with the ability to reverse-engineer SQL queries and Scala code to assess and optimize functionality.
+ Strong exposure to AWS Data Services (S3, Lambda, Glue, Kinesis, EMR, Redshift, DynamoDB).
+ Proven experience in database design, data modeling, data mining, and segmentation.
+ Strong analytical skills with the ability to identify trends, patterns, and insights in complex data sets.
+ Technical proficiency across data extraction, cleaning, and transformation using automated tools.
+ Proficiency in additional programming languages such as Python.
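
The Kafka and Spark experience listed above typically comes together in Spark Structured Streaming. The sketch below is a minimal, hedged example of that pattern; the broker address, topic name, and output/checkpoint paths are illustrative placeholders, and the job assumes the Spark Kafka connector is on the classpath.

```scala
import org.apache.spark.sql.SparkSession

object ReservationStreamSketch {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("reservation-stream-sketch")
      .getOrCreate()

    // Hypothetical broker and topic, for illustration only.
    val events = spark.readStream
      .format("kafka")
      .option("kafka.bootstrap.servers", "broker:9092")
      .option("subscribe", "reservation-events")
      .option("startingOffsets", "latest")
      .load()

    // Kafka delivers key/value as binary; cast the value to a string payload
    // before any downstream parsing or enrichment.
    val payloads = events.selectExpr(
      "CAST(value AS STRING) AS payload",
      "timestamp"
    )

    // Append the raw payloads to a hypothetical S3 location with checkpointing.
    val query = payloads.writeStream
      .format("parquet")
      .option("path", "s3://example-bucket/stream/reservation_events/")
      .option("checkpointLocation", "s3://example-bucket/checkpoints/reservation_events/")
      .outputMode("append")
      .start()

    query.awaitTermination()
  }
}
```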

**Set Yourself Apart With**

+ Familiarity with containerization and cloud-based deployments.
+ AWS certifications.

**Additional information**

**This is a full-time role (40 hours per week) open to both permanent and contractor hires in Mexico, Colombia, and Costa Rica, and available only as a contractor position in Brazil.**

