Dear Associates,
Our client is looking for a Data Engineer in Philadelphia, PA (remote), full-time. Please read the job details below, and if you are comfortable with the role, share your updated resume.
You will be on Panamax's W2 and work as a contractor with the client; this is a long-term project of 12+ months.
Role: BIG DATA DEVELOPER WITH SPARK/AWS/KAFKA
Location: Philadelphia, PA
Duration: Long Term
Primary Skills: Spark, Big Data, AWS
Education: Bachelor's degree in Computer Science, Electrical/Electronic Engineering, Information Technology, or another related field, or equivalent.
Experience: Around 8 years
Travel: No
Local Preferred: Yes
Note: Must be able to work on a W2 basis
1. Must have knowledge of Spark/Spark SQL: Spark Core, Spark SQL, DataFrames, Spark Streaming, driver node, worker nodes, stages, executors, and tasks in Databricks for ETL.
2. Databricks CLI, API, streaming, notebook preparation, job submission, and Databricks DevOps automation.
3. Knowledge of Python/Scala.
4. Knowledge of HDFS, Hive, Pig, Sqoop, Spark, and Kafka.
5. Real-time and batch processing.
6. AWS: knowledge of architecture and offerings, as well as general cloud concepts.
7. Must have solid hands-on CLI and console experience with IAM, S3, EC2, and Lambda.
8. ETL data pipelines.