Luxoft is an emerging global leader in high-end software application development.
Luxoft is a global IT outsourcing and software development company with 7,000 employees across 19 locations. It was recently recognized as the Leading Central and Eastern European Service Provider in the 2010 Global Services 100 list. Luxoft is known for its flexible delivery models, a world-class IT talent pool, low attrition rates, best-of-breed processes, and the highest security and quality standards.
Luxoft is a trusted long-term partner of world-class leaders such as Avaya, Boeing, Dell, Deutsche Bank, and IBM.
Currently we are looking for
Data Engineer
Location: Krakow
Nr Ref.: VR-8023
Luxoft - Leading Central and Eastern European IT Service Provider. Learn more about Luxoft at www.luxoft.com
Responsibilities:
Requirements:
Apart from what everyone else offers, we offer:
Our new client operates the world's largest network of collaboratively published content on the web. Its product is home to over 400,000 communities for fans, by fans. The client is often cited as the best source of accurate, dynamic, and fresh fan-authored information fueling virtually all aspects of popular culture: console and mobile games, cable and streaming TV shows, music festivals, big movie franchises, major sporting events, anticipated book releases, fashion trends and DIY, food and drink recipes, and current events on an international scale. The client's fan base continues to fuel consistent growth, with over 2 billion global monthly page views in over 200 languages, more than 40% of the traffic coming exclusively from mobile visits. Fans find their favorite fandoms through seven distinct and discoverable hubs:
- Games
- Movies
- TV
- Comics
- Music
- Books
- Lifestyle
We are looking for data enthusiasts with a proven track record of setting up large data sets using Hadoop, Hive, or Pig for data summarization, queries, and analysis, and who are also proficient in scripting languages such as Perl and/or Python. Several projects involve migrating the data warehouse (DWH) to Hadoop, building new data storages for content aggregation, and more.
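To give a flavor of the kind of data summarization the role involves, here is a minimal sketch of a Hadoop Streaming-style map/reduce pair in pure Python. The `hub<TAB>views` input format and the sample figures are purely illustrative assumptions, not the client's actual data; in a real Hadoop Streaming job the map and reduce steps would run as separate scripts reading from stdin, with the framework handling the sort between them.

```python
# Illustrative sketch only: a map/reduce pair in the style of Hadoop
# Streaming, run locally. Input lines use a hypothetical "hub<TAB>views"
# format; none of this reflects the client's real schema.
from itertools import groupby

def map_views(lines):
    """Mapper: parse 'hub<TAB>views' lines into (key, count) pairs."""
    for line in lines:
        hub, views = line.rstrip("\n").split("\t")
        yield hub, int(views)

def reduce_views(pairs):
    """Reducer: sum counts per key. Pairs are sorted first, mirroring
    the sort/shuffle phase Hadoop performs between map and reduce."""
    for hub, group in groupby(sorted(pairs), key=lambda kv: kv[0]):
        yield hub, sum(views for _, views in group)

# Hypothetical sample input.
sample = ["Games\t120", "TV\t45", "Games\t30", "Music\t12"]
totals = dict(reduce_views(map_views(sample)))
print(totals)  # {'Games': 150, 'Music': 12, 'TV': 45}
```

The same per-hub aggregation would be a one-line `GROUP BY` in Hive or Pig; the point of the sketch is the mapper/reducer split that Hadoop parallelizes across a cluster.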