
Data Engineer at TECLA


This is an opportunity to join a rapidly growing Silicon Valley company early on and contribute to the development of next-generation, game-changing products in the insurance industry. We are a leader in providing the most technologically advanced solutions, dedicated to pushing the boundaries of what’s possible. Our mission is to drive the best claims outcomes for both insurers and the insured by leveraging innovative machine learning, including deep learning and natural language processing. Our models generate key insights and predictions that help claims adjusters make optimal decisions at every stage of the claims process.

Key Responsibilities:

  • Design, develop, maintain, and enhance highly scalable data engineering solutions leveraging AWS services.
  • Design, build, document, and implement scalable pipelines with a strong focus on data quality and reliability.
  • Ingest and transform structured, semi-structured, and unstructured data from multiple sources, and develop enterprise-level ETL/ELT solutions.
  • Innovate and build proprietary algorithms to tackle complex data challenges.
  • Execute and continually optimize new customer data ingestion and model implementation processes.
  • Integrate business knowledge with technical functionalities.
  • Collaborate closely with application engineers, data scientists, product managers, and product delivery teams.
  • Develop solutions at the intersection of data and ML.
  • Apply best practices for using cloud provider AI services.
  • Contribute in an agile, collaborative, and fast-paced environment.
  • Monitor workflow performance and reliability, and ensure SLA targets are met.
  • Automate existing code and processes using scripting, CI/CD, infrastructure-as-code, and configuration management tools.
  • Troubleshoot and solve problems creatively, thinking outside the box to address uncommon challenges.
  • AI is at the core of our work, and for those interested in AI-driven challenges, there are opportunities to work on NLP, image analysis and featurization, and OCR labeling.

Skills, Knowledge, and Expertise:

  • 3+ years of experience with Python and Scala for data engineering and ETL.
  • 3+ years of experience with data pipeline tools (e.g., Informatica, Spark, Spark SQL), DAG orchestration, and workflow management tools (e.g., Airflow, AWS Step Functions).
  • 3+ years of experience working in the AWS or GCP ecosystems.
  • 3+ years of experience using cloud provider AI services.
  • 3+ years of experience with Kubernetes and developing applications at scale.
  • 3+ years of hands-on experience developing ETL solutions using AWS services, including S3, IAM, Lambda, RDS, Redshift, Glue, SQS, EKS, and ECR.
  • High proficiency in SQL programming with relational databases; experience writing complex SQL queries is a must.
  • Experience working with distributed computing tools (e.g., Spark, Hive).
  • Strong knowledge of software engineering best practices, including version control (Git), CI/CD (Jenkins, GitLab CI/CD, GitHub Actions), automated unit testing, and DevOps.
  • Experience with containerization and orchestration tools (Docker, Kubernetes, Helm).
  • Proven experience in a fast-paced agile development environment.

Nice to Have:

  • AWS certifications (AWS Certified Solutions Architect, Developer, or DevOps).
  • Knowledge of commercial claims management systems.

Benefits:

  • A fully remote position that supports work-life balance.
  • The opportunity to be part of a rapidly growing company shaping the future of the insurance industry.
  • Two weeks of paid vacation per year.
  • 10 paid days for local holidays.

*Please note: we are only looking for dedicated full-time team members who are eager to integrate fully into our team.

Fully remote: you can work from anywhere in the world.

