
Hadoop development project

min ₹2500 INR / hour

Closed
Posted over 5 years ago


I am looking for a developer who has hands-on experience with Hadoop, Spark, and Scala.
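For context, below is a minimal sketch of the kind of hands-on Spark-with-Scala work on Hadoop that the posting implies: a batch job that reads data from HDFS, aggregates it with the DataFrame API, and writes the result back. The application name, paths, and column names are hypothetical placeholders, not details from the project.

```scala
// Minimal Spark-on-Scala batch sketch: read a CSV from HDFS and aggregate it.
// All paths and column names are hypothetical placeholders.
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions._

object EventCounts {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("EventCounts")
      .getOrCreate()

    // Read raw events from HDFS (path is an assumption).
    val events = spark.read
      .option("header", "true")
      .csv("hdfs:///data/events/*.csv")

    // Count events per type, most frequent first.
    val counts = events
      .groupBy("event_type")
      .agg(count("*").as("n"))
      .orderBy(desc("n"))

    // Write results back to HDFS as Parquet.
    counts.write.mode("overwrite").parquet("hdfs:///data/event_counts")

    spark.stop()
  }
}
```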
Project ID: 17887575

About the project

15 proposals
Remote project
Active 5 yrs ago

Looking to make some money?

Benefits of bidding on Freelancer

Set your budget and timeframe
Get paid for your work
Outline your proposal
It's free to sign up and bid on jobs
15 freelancers are bidding on average ₹2,674 INR/hour for this job
Hi, I have 7 years of experience working with Hadoop, Spark, NoSQL, Java, BI tools (Tableau, Power BI), and cloud platforms (Amazon, Google, Microsoft Azure). I have delivered end-to-end data warehouse management projects on AWS with Hadoop, Hive, Spark, and PrestoDB, and have worked on multiple ETL projects involving Kafka, NiFi, Flume, MapReduce, Spark with XML/JSON, Cassandra, MongoDB, HBase, Redis, Oracle, SAP HANA, ASE, and many more. Let's discuss the requirements in detail. I am committed to getting the work done and strong at issue resolution as well. Thanks
₹2,500 INR in 40 days
5.0 (8 reviews)
4.1
Hi, I am a Hadoop data engineer at PayPal Inc. and can work on this in my spare time. I am familiar with Spark and Scala. Looking forward to working with you.
₹2,500 INR in 28 days
0.0 (0 reviews)
0.0
Hi, we are a team of Big Data and Hadoop developers with more than 9 years of extensive experience. We have strong experience in Hadoop, Spark, Scala, Hive, HBase, MapReduce, Java, and Python, and we can surely work on your project. Some of our past projects: we are currently working on a data warehousing project for a security and law firm in London, covering security log generation and analysis. We use Apache NiFi for data stream ingestion, process the stream with Apache Spark, perform lookups against files and HBase, and have built a log-monitoring dashboard with Kibana. We are also working with an enterprise client where we pull data from different payload assets flowing from IoT Hub into Spark, process it in Spark, and write the results to Cassandra, Redis, HBase, and an in-memory store. Let's connect and discuss the requirements in more detail. Thank you.
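This bid describes a stream-enrichment pattern: an ingestion layer (NiFi) lands records, Spark processes them, and each record is enriched via a lookup before being stored. Below is a simplified Structured Streaming sketch of that pattern, not the bidder's actual code: the paths and schema are hypothetical, and a Parquet-backed lookup table stands in for the HBase lookup the bidder mentions.

```scala
// Simplified stream-enrichment sketch (all paths/columns are hypothetical):
// an ingestion tool such as NiFi lands JSON files in a staging directory,
// Spark Structured Streaming picks them up, enriches each record via a
// lookup join, and writes the enriched stream out as Parquet.
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.types._

object LogEnrichment {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("LogEnrichment")
      .getOrCreate()

    val logSchema = new StructType()
      .add("host", StringType)
      .add("event", StringType)
      .add("ts", TimestampType)

    // Static lookup data (the bid uses HBase; a Parquet table keeps the sketch simple).
    val hostInfo = spark.read.parquet("hdfs:///ref/host_info")

    // Streaming source: JSON files landed by the ingestion layer.
    val logs = spark.readStream
      .schema(logSchema)
      .json("hdfs:///staging/security_logs")

    // Enrich each log record with host metadata via a stream-static join.
    val enriched = logs.join(hostInfo, Seq("host"), "left")

    // Write the enriched stream out; a real job would also tune the trigger.
    val query = enriched.writeStream
      .format("parquet")
      .option("path", "hdfs:///curated/security_logs")
      .option("checkpointLocation", "hdfs:///checkpoints/security_logs")
      .start()

    query.awaitTermination()
  }
}
```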
₹2,500 INR in 40 days
0.0 (0 reviews)
0.0
Hi, I have two years of experience installing, configuring, and testing Hadoop ecosystem components, and I am capable of processing large sets of structured, semi-structured, and unstructured data. Relevant skills and experience: I am familiar with data architecture, including data ingestion, pipeline design, Hadoop information architecture, data modeling and data mining, machine learning, Spark, Scala, and advanced data processing. I need your exact requirements to move forward with this job. Talk soon! Garima
₹2,777 INR in 40 days
0.0 (0 reviews)
0.0
We need more details about the project so that we can discuss the approach and how to proceed with development.
₹2,777 INR in 40 days
0.0 (0 reviews)
0.0
I have good experience working with major banks such as the Royal Bank of Canada.
₹2,777 INR in 10 days
0.0 (0 reviews)
0.0
Extensive experience in Spark with Scala and all the major Hadoop components, including Kafka and Elasticsearch. I will ensure on-time delivery.
₹2,555 INR in 20 days
0.0 (0 reviews)
0.0

About the client

Prospect Heights, United States
0.0
0
Member since Oct 2, 2018

Client verification
