About the Team
At Coderoad, we’re more than just a software development company — we’re your gateway to the global tech world.
Whether you’re looking to skill up or level up your career, we offer the challenges you’ve been searching for.
We provide end‑to‑end software development services and give you the opportunity to work on exciting, real‑world projects in a supportive environment.
From staff augmentation to dedicated IT teams to general software engineering, we have opportunities for everyone to challenge themselves and take their career to the next level.
Our client is transforming customer communication and engagement for dealerships across North America.
Their platform empowers automotive and powersport dealerships to text customers, gather reviews, collect payments, generate leads, and manage online reputation — all powered by AI.
Endorsed by major OEMs and integrated with Dealer Management Systems (DMS) and Customer Relationship Management (CRM) platforms, they’re driving a revolution in customer experience across every touchpoint.
Now, they’re expanding their offerings to deliver end‑to‑end customer journey management — helping dealerships connect more effectively, optimize marketing spend, boost conversions, and build lasting loyalty.
About the Role
As the Lead Data Engineer, you will lead the development of our data ecosystem, including architecting and implementing a customer data platform.
This platform will unify batch and real‑time data from various sources such as DMS, CRMs, social networks, and web traffic to create comprehensive 360‑degree customer profiles.
You will collaborate closely with a Product Manager to ensure that data initiatives align with the product roadmap—prioritizing data products that deliver customer‑facing value.
Working alongside engineers, analysts, and data scientists, you’ll enable data‑driven insights and power advanced digital initiatives such as real‑time personalization, predictive modeling, and targeted marketing campaigns.
This role involves defining architecture, tools, and strategy for data pipelines, data lakes, cloud data warehousing, and integrations with marketing automation tools.
Location: LATAM
Time Zone: Team operates on U.S. East/West Coast hours
How You’ll Make an Impact
- Lead the design and implementation of a scalable Customer Data Platform (CDP) integrating batch and real‑time data from multiple sources (DMS, CRMs, social networks, web traffic).
- Design and develop data pipelines using tools like Fivetran, dbt, Apache Spark, and Airflow to ensure high‑performance, reliable data flow.
- Architect and manage data lakes and cloud data warehouses (e.g., BigQuery, Redshift, Snowflake, Databricks).
- Build and optimize streaming pipelines with GCP Pub/Sub, Kafka, or AWS SNS to support real‑time analytics and personalization.
- Develop integrations with Customer Data Platforms (CDPs) such as Twilio Segment or RudderStack and connect with marketing automation systems.
- Collaborate with Product, Engineering, and Data Science teams to deliver data‑driven insights powering personalization, predictive modeling, and segmentation.
- Ensure strong alignment between data initiatives and the product roadmap, prioritizing data products that deliver measurable customer impact.
- Define architecture and strategy for multi‑tenant SaaS data solutions, ensuring scalability, reliability, and data security.
- Drive adoption of modern technologies and best practices in data engineering, promoting a culture of technical excellence.
- Maintain high standards for data quality, governance, and compliance with industry regulations.
What We’re Looking For
- 7+ years of professional experience in data engineering, with deep expertise in both batch and streaming data architectures.
- Proven experience building scalable ETL/ELT pipelines and distributed data processing frameworks.
- Strong proficiency in Python and modern ETL tools (GCP Cloud Dataflow, Fivetran, dbt, Apache Spark).
- Hands‑on experience with cloud‑based data platforms such as BigQuery, Snowflake, Redshift, or Databricks.
- Solid understanding of Customer Data Platforms (CDPs) like Twilio Segment or RudderStack.
- Experience with streaming technologies (GCP Pub/Sub, Kafka, AWS SNS) is highly desirable.
- Background in multi‑tenant SaaS architectures is a strong plus.
- Excellent communication skills and a collaborative, problem‑solving mindset.
What You’ll Love
- USA Contractor
- 100% Remote
- Holidays Off
- Paid Time Off
- Health Insurance Assistance Program
- Competitive Pay (USD)
- Excellent teamwork and work environment
- Training and upskilling opportunities
- High‑impact role shaping the company’s data ecosystem and AI capabilities
- Work with a modern data stack and innovative, cross‑functional team