Big Data Developer

Pittsburgh, PA, USA

Ranked as #12 on Forbes’ List of 25 Fastest Growing Public Tech Companies for 2017, EPAM is committed to providing our global team of 25,900+ EPAMers with inspiring careers from day one. EPAMers lead with passion and honesty and think creatively. Our people are the source of our success and we value collaboration, try to always understand our customers’ business, and strive for the highest standards of excellence. No matter where you are located, you’ll join a dedicated, diverse community that will help you discover your fullest potential.

DESCRIPTION

You are curious, persistent, logical and clever – a true techie at heart. You enjoy living by the code of your craft and developing elegant solutions for complex problems. If this sounds like you, this could be the perfect opportunity to join EPAM as a Big Data Developer. Scroll down to learn more about the position’s responsibilities and requirements.

EPAM’s Financial Services Practice is looking for exceptionally talented people to join our team of world-class engineers. We are looking for a Big Data Engineer who will collect, store, process, and analyze large data sets. The primary focus will be on choosing optimal solutions for these purposes, then implementing, maintaining, and monitoring them. You will also be responsible for integrating them with the architecture used across the company.

What You’ll Do

  • Collect, store, process, and analyze large sets of data;
  • Build creative solutions to implement, maintain, and monitor data processing;
  • Integrate your solutions with the architecture used company-wide.

What You Have

  • Proficient understanding of distributed computing principles;
  • Knowledge of Hadoop clusters and their included services (Cloudera preferred);
  • Ability to handle ongoing issues with operating the cluster;
  • Proficiency with Hadoop v2, MapReduce, HDFS, Sqoop;
  • Experience building stream-processing systems using solutions such as Storm or Spark Streaming;
  • Solid knowledge of Data Science querying tools, such as Pig, Hive, and Impala;
  • Experience with Spark;
  • Experience integrating data from multiple sources, such as Microsoft SQL Server and Oracle;
  • Understanding of SQL queries, joins, stored procedures, relational schemas;
  • Experience with NoSQL databases, such as HBase, Cassandra, MongoDB (preferred);
  • Knowledge of various ETL techniques and frameworks, such as Flume;
  • Experience with various messaging systems, such as Kafka or RabbitMQ;
  • Experience with Cloudera and Financial Services (FS) domain knowledge is a huge plus.

What We Offer

  • Medical, Dental and Vision Insurance (Subsidized);
  • Health Savings Account;
  • Flexible Spending Accounts (Healthcare, Dependent Care, Commuter);
  • Short-Term and Long-Term Disability (Company Provided);
  • Life and AD&D Insurance (Company Provided);
  • Matched 401(k) Retirement Savings Plan;
  • Paid Time Off;
  • Legal Plan and Identity Theft Protection;
  • Accident Insurance;
  • Employee Discounts;
  • Pet Insurance.