PySpark Developer

Full-time · Posted 3 weeks ago

Role & responsibilities

  • Strong proficiency in Python programming.
  • Hands-on experience with PySpark and Apache Spark.
  • Knowledge of Big Data technologies (Hadoop, Hive, Kafka, etc.).
  • Experience with SQL and relational/non-relational databases.
  • Familiarity with distributed computing and parallel processing.
  • Understanding of data engineering best practices.
  • Experience with REST APIs, JSON/XML, and data serialization.
  • Exposure to cloud computing environments.
  • 5+ years of experience in Python and PySpark development.
  • Experience with data warehousing and data lakes.
  • Knowledge of machine learning libraries (e.g., MLlib) is a plus.