DevOps Engineer (W2 Contract role), EtekIT, Chicago, IL, Contractor, $124,465.00 - $148,750.00 / year

Requirements

  • Bachelor's or Master's degree in Computer Science, Engineering, Software Engineering, or a relevant field.
  • At least 3 years of experience as a DevOps Engineer.
  • Strong experience with Linux-based infrastructures, Linux/Unix administration, and AWS.
  • Knowledge of programming and scripting languages such as Java, JavaScript, Perl, Python, Groovy, and Bash.
  • Experience with Agile project management and workflow tools and practices such as Jira, Scrum, and Kanban.
  • Experience with open-source technologies and cloud services.
  • Strong communication skills and the ability to explain protocols and processes to the team and management.
  • SailPoint IdentityIQ troubleshooting skills are a plus.

Responsibilities

  • Troubleshoot and debug Hadoop ecosystem runtime issues
  • Assess the quality of datasets for a Hadoop data lake
  • Develop efficient Pig and Hive scripts with joins on datasets using various techniques (see the join sketch after this list)
  • Execute the design and development of data ingestion and data transformation for big data applications (see the ingestion sketch after this list)
  • Tune Hadoop jobs (Hive, Pig)
  • Move data from one cluster to another
  • Troubleshoot issues with Hive and HBase
  • Enable the next-generation customer analytics platform using big data technologies
  • Lead the design and implementation of sustainable tools & processes to support the big data ecosystem
  • Support the big data ecosystem
  • Make sure that all the big data applications are available and performing as expected
  • Troubleshoot technical and performance issues in the big data ecosystem
  • Create big data and analytics solutions for the Digital channel
  • Perform data profiling on big data distributed computing environments, relational databases and XML
  • Help architect and build a big data platform used for analytics
  • Assist in data analysis and data modeling
  • Lead the implementation and support of leading-edge data warehouse / big data environments
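
For the join and job-tuning bullets above, here is a minimal PySpark sketch of one common technique, a broadcast (map-side) join of a small dimension table into a large fact table. The database and table names (analytics.transactions, analytics.merchants) and the join key are hypothetical placeholders, not part of the posting.

    # Minimal sketch: broadcast (map-side) join of a small lookup table into a
    # large fact table. Table names and join key are hypothetical placeholders.
    from pyspark.sql import SparkSession
    from pyspark.sql.functions import broadcast

    spark = (SparkSession.builder
             .appName("broadcast-join-sketch")
             .enableHiveSupport()   # read tables registered in the Hive metastore
             .getOrCreate())

    transactions = spark.table("analytics.transactions")  # large fact table (hypothetical)
    merchants = spark.table("analytics.merchants")        # small dimension table (hypothetical)

    # Broadcasting the small side avoids shuffling the large fact table.
    enriched = transactions.join(broadcast(merchants), on="merchant_id", how="left")

    enriched.write.mode("overwrite").saveAsTable("analytics.transactions_enriched")

Broadcasting the small side is usually the first tuning lever for oversized or skewed joins, since it keeps the large table from being shuffled across the cluster.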
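
Similarly, for the data ingestion and transformation bullet, a small sketch assuming a PySpark job that loads raw CSV files into a partitioned, Hive-backed data lake table. The HDFS path, column names, and table name are illustrative assumptions only.

    # Minimal ingestion sketch: raw CSV -> light transform -> partitioned Hive-backed table.
    # The HDFS path, schema fields, and table name are illustrative assumptions.
    from pyspark.sql import SparkSession
    from pyspark.sql.functions import to_date, col

    spark = (SparkSession.builder
             .appName("ingestion-sketch")
             .enableHiveSupport()
             .getOrCreate())

    # Ingest: read raw CSV from a (hypothetical) HDFS landing zone.
    raw = (spark.read
           .option("header", "true")
           .option("inferSchema", "true")
           .csv("hdfs:///landing/payments/*.csv"))

    # Transform: derive a typed partition column and drop duplicate records.
    cleaned = (raw
               .withColumn("event_date", to_date(col("event_ts")))
               .dropDuplicates(["transaction_id"]))

    # Load: append into a partitioned, Hive-backed Parquet table.
    (cleaned.write
     .mode("append")
     .partitionBy("event_date")
     .format("parquet")
     .saveAsTable("datalake.payments"))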

Recommended Skills

Amazon Web Services
Apache Hadoop
Big Data
Apache Hive
Data Ingestion
Perl (Programming Language)
 