Developer - Specialty 3, V R Della Infotech Inc, San Ramon, CA

Job Title: Developer - Specialty 3 (12293469)

Location: Hybrid - CA

Duration: 6 Months Contract

Intake Call Notes:

Job Title: Big Data Developer

Location: Hybrid within California (San Ramon, CA). Starts remote within the BOW footprint; later the candidate will need to be onsite 2-3 days a week

Duration: 6 Months Contract (Possible extension)

Laptop: Candidate's own laptop (VDI will be provided by the client)

Interview Process: 2 rounds of video calls, 45-60 minutes each

1st round is a coding test (make sure candidates have a good laptop for the interview)

Candidates without a LinkedIn profile will not be considered

Project: Data Hub document repository project

Must Have Skills:
  • 5+ years of overall experience with Big Data technologies
  • Python
  • Spark
  • SQL
  • Strong development experience on the Hadoop ecosystem & ETL
  • Cloudera clusters
Nice-to-have Skills:
  • Kafka, Java
  • DevOps experience
  • Apache tools (NiFi, Kafka, and Airflow)
  • Banking/Finance experience
Job Description:

Day-to-Day Responsibilities of this Position and Description of Project:

Summary: Seeking a Developer (Other Specialty) to design, develop, and implement applications using in-demand languages and technologies (e.g., Java, WebSphere, Informatica) to support business requirements.

Job Responsibilities:
  • Analyze highly complex business requirements; generate technical specifications to design or redesign complex software components and applications.
  • Act as an expert technical resource for modeling, simulation, and analysis efforts.
  • Leverage industry best practices to design, test, implement, and support a solution.
  • Ensure quality, security, and compliance requirements are met for the supported area.
  • Be flexible and thrive in an evolving environment.
  • Adapt to change quickly and adjust work accordingly in a positive manner.

Qualifications:
  • Bachelor's degree in a technical field such as computer science, computer engineering, or a related field required.
  • 5-7 years of experience required.
  • Development experience in the needed language or technology (e.g., WebSphere, Informatica).
  • Hands-on experience designing, developing, and successfully deploying large-scale projects end-to-end.
  • Hands-on experience following an iterative and agile SDLC.

Hybrid role - with RTO, the candidate needs to be in the office 2-3 days a week.
  • Build big data pipelines and structures for large-scale banking systems.
  • Implement and manage large-scale ETL jobs on Hadoop/Spark clusters on the Cloudera and Hortonworks platforms (an illustrative sketch follows below).
  • Interface with internal teams and data consumer teams to understand their data needs.
  • Own data quality throughout all stages of acquisition and processing, including data collection, ETL/wrangling, and normalization.
  • 4+ years of experience working with large data sets using open-source technologies such as Spark, Hadoop, and Kafka on one of the major big data stacks (Cloudera, Hortonworks) or a cloud system such as EMR.
  • Strong SQL (Hive, MySQL, etc.) and NoSQL (HBase, etc.) skills, including writing complex queries and performance tuning.
  • Good command of Python, Spark, and big data techniques (Hive/Pig, MapReduce, Hadoop streaming, Kafka).
  • Excellent communication and relationship skills; a strong team player.

Preferred Qualifications:
  • Experience developing and productizing real-world large-scale data pipelines.
  • Experience with Apache Airflow, Apache NiFi, Kafka, Apache Atlas, and Schema Registry.
  • DevOps experience, e.g., GitLab, GitHub, Ansible, and automation.
  • Expertise in Python scripting; Java is a big plus.
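For a sense of the day-to-day ETL work described above, here is a minimal PySpark sketch of that kind of pipeline: read raw data from HDFS, clean and deduplicate it, and write a partitioned Hive table. It is illustrative only; the paths, table names, and column names (hdfs:///data/landing/documents/, datahub.documents_clean, doc_id, ingest_ts) are hypothetical assumptions, not details from this posting.

from pyspark.sql import SparkSession
from pyspark.sql import functions as F

# Illustrative job: all paths, table names, and columns below are hypothetical.
spark = (
    SparkSession.builder
    .appName("datahub-document-etl")  # hypothetical app name
    .enableHiveSupport()              # Hive tables are typical on Cloudera clusters
    .getOrCreate()
)

# Extract: read raw document metadata landed on HDFS (hypothetical path).
raw = spark.read.parquet("hdfs:///data/landing/documents/")

# Transform: drop malformed rows, derive a partition column, deduplicate.
clean = (
    raw.filter(F.col("doc_id").isNotNull())
       .withColumn("ingest_date", F.to_date(F.col("ingest_ts")))
       .dropDuplicates(["doc_id"])
)

# Load: write a partitioned Hive table for downstream data consumers.
(
    clean.write
         .mode("overwrite")
         .partitionBy("ingest_date")
         .saveAsTable("datahub.documents_clean")  # hypothetical table
)

spark.stop()

In practice a job like this would be submitted with spark-submit and scheduled by an orchestrator such as Apache Airflow, which the preferred qualifications mention.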
 