Databricks Developer

Employment Information

Full-time

Skills and Experience Required


  • 3+ years of hands-on experience working with Databricks, Apache Spark, or related big data platforms
  • Proficiency in programming languages such as Python, Scala, or SQL for data transformation and pipeline development
  • Hands-on experience with Spark SQL, Delta Lake, Lakeflow Declarative Pipelines, and Unity Catalog (see the sketch after this list)
  • Strong understanding of ETL/ELT workflows, data modeling, and data warehousing concepts
  • Experience with cloud platforms (Azure, AWS, or GCP)
  • Familiarity with CI/CD practices and version control tools such as Git
  • Knowledge of data governance, data quality, and security best practices
  • Excellent problem-solving, analytical, and communication skills
  • Ability to work collaboratively in an agile team environment
  • Databricks certifications are preferred
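
To give a concrete sense of the day-to-day work behind these requirements, the sketch below shows a minimal PySpark transformation that writes a Delta table and queries it with Spark SQL. It is illustrative only and assumes a Databricks-style environment with Delta support; the source path, column names, and the three-level table name (main.analytics.daily_events) are hypothetical.

    # Minimal sketch: ingest raw JSON, transform, and publish a Delta table.
    # Assumes a Databricks-style environment; names and paths are hypothetical.
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.getOrCreate()

    # Ingest raw, semi-structured events (hypothetical source path)
    raw = spark.read.json("/mnt/raw/events/")

    # Basic cleansing and aggregation
    daily = (
        raw.filter(F.col("event_type").isNotNull())
           .withColumn("event_date", F.to_date("event_timestamp"))
           .groupBy("event_date", "event_type")
           .agg(F.count("*").alias("event_count"))
    )

    # Persist as a Delta table registered under a Unity Catalog-style name
    (daily.write
          .format("delta")
          .mode("overwrite")
          .saveAsTable("main.analytics.daily_events"))

    # Downstream consumers can query it with Spark SQL
    spark.sql(
        "SELECT * FROM main.analytics.daily_events ORDER BY event_date DESC LIMIT 10"
    ).show()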

Responsibilities


  • Implement data ingestion, transformation, and integration processes from various structured and unstructured data sources
  • Develop, maintain, and optimize scalable data pipelines and workflows on the Databricks platform (an illustrative sketch follows this list)
  • Collaborate with analysts and business stakeholders to understand data requirements and deliver solutions
  • Ensure data reliability, quality, and security across all data engineering processes
  • Troubleshoot, debug, and resolve data pipeline issues and performance bottlenecks
  • Document data engineering workflows and technical designs
  • Monitor and maintain data pipeline health, proactively addressing any potential issues
  • Stay up-to-date with emerging data engineering technologies and best practices, especially within the Databricks ecosystem
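
As one illustration of how the pipeline-development and data-quality responsibilities come together, the sketch below shows a single declarative pipeline step (DLT / Lakeflow style) that applies quality expectations before publishing a cleansed table. It assumes the dlt module is available inside a Databricks pipeline run; the dataset and column names (raw_orders, order_id, amount) are hypothetical.

    # Minimal sketch of a declarative pipeline step with quality expectations.
    # Runs only inside a Databricks pipeline; dataset/column names are hypothetical.
    import dlt
    from pyspark.sql import functions as F

    @dlt.table(comment="Cleansed orders with basic quality expectations")
    @dlt.expect_or_drop("valid_order_id", "order_id IS NOT NULL")
    @dlt.expect_or_drop("positive_amount", "amount > 0")
    def cleansed_orders():
        # Read the upstream raw dataset defined elsewhere in the pipeline
        return (
            dlt.read_stream("raw_orders")
               .withColumn("ingested_at", F.current_timestamp())
        )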