Ranked as #12 on Forbes’ List of 25 Fastest Growing Public Tech Companies for 2017, EPAM is committed to providing our global team of over 24,000 people with inspiring careers from day one. EPAMers lead with passion and honesty, and think creatively. Our people are the source of our success and we value collaboration, try to always understand our customers’ business, and strive for the highest standards of excellence. No matter where you are located, you’ll join a dedicated, diverse community that will help you discover your fullest potential.
You are curious, persistent, logical and clever – a true techie at heart. You enjoy living by the code of your craft and developing elegant solutions for complex problems. If this sounds like you, this could be the perfect opportunity to join EPAM as a Machine Learning Engineer. Scroll down to learn more about the position’s responsibilities and requirements.
Build innovative software applications;
Design solutions while playing a hands-on development role to deliver products in a rapid, dynamic environment;
Develop innovative solutions to given challenges;
Develop tools to monitor system health, performance, and reliability;
Take responsibility for the operations of high-load data infrastructure;
Manage and monitor legacy system operations and enhance monitoring to ensure SLA compliance;
Participate in regular scrum meetings.
Bachelor’s degree in Computer Science and 7-10+ years of development experience, or Master’s degree in CS and 5+ years of experience;
Solid knowledge of secure coding practices and experience with open source technologies;
Experience in distributed systems and in the design and implementation of high-throughput, low-latency applications;
Extensive hands-on experience building solutions for large-scale internet infrastructure;
Strong object-oriented programming skills and senior-level development proficiency in Java;
Thorough understanding of TCP, WebSockets, and libraries like Netty;
Experience developing ETL solutions for clients in different domains dealing with various types of challenging data;
Solid understanding of Spark, HDFS, Hive, ORC and Presto;
Experience architecting distributed systems, with concurrent programming and data structure implementation;
Solid understanding of Hadoop and NoSQL technologies like Cassandra;
Strong problem-solving, critical thinking, and communication skills;
Ability to learn new technologies in a short time;
Development and implementation experience on large scale mission critical applications;
Self-motivated, proactive, and a solution-oriented developer.