
Integrate our big data infrastructure in the Amazon Web Services cloud

$12-25 USD / hour

Closed
Posted over 6 years ago

We are seeking a developer to partner with other engineering teams to help architect and build the data pipeline that ingests hundreds of billions of data points for our Field Analytics Platform on AWS. You will expand our capabilities by integrating open-source data processing technologies such as Hadoop, Kafka, Spark, Cassandra, and Neo4j into our infrastructure, and become an expert in the AWS services we leverage. You will help us efficiently integrate our big data infrastructure in the AWS cloud; build services, deploy models and algorithms, perform model training, and provide tools that make our infrastructure more accessible to all of our data scientists; and enable specific initiatives that build our capabilities, from environmental classification to In-Season Field Analytics and more. Requires a degree or 15+ years of experience in the field or a related area.
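For illustration only, the sketch below shows the kind of ingestion job this description implies: a PySpark Structured Streaming application that reads events from a Kafka topic and lands them in S3 as Parquet. This is a minimal sketch under assumptions of my own, not the client's actual pipeline; the broker address, topic, schema, and bucket paths are placeholders, and it assumes the spark-sql-kafka-0-10 connector and hadoop-aws are available on the cluster.

from pyspark.sql import SparkSession
from pyspark.sql.functions import from_json, col
from pyspark.sql.types import (
    StructType, StructField, StringType, DoubleType, TimestampType
)

spark = SparkSession.builder.appName("field-analytics-ingest").getOrCreate()

# Hypothetical schema for incoming field-sensor readings.
schema = StructType([
    StructField("field_id", StringType()),
    StructField("metric", StringType()),
    StructField("value", DoubleType()),
    StructField("observed_at", TimestampType()),
])

# Read the raw event stream from Kafka (placeholder broker and topic).
raw = (
    spark.readStream
    .format("kafka")
    .option("kafka.bootstrap.servers", "broker1:9092")
    .option("subscribe", "field-observations")
    .load()
)

# Parse the JSON payload in the Kafka value column into typed columns.
parsed = (
    raw.select(from_json(col("value").cast("string"), schema).alias("obs"))
    .select("obs.*")
)

# Write the parsed events to S3 as Parquet, with checkpointing for recovery
# (placeholder bucket paths).
query = (
    parsed.writeStream
    .format("parquet")
    .option("path", "s3a://example-bucket/field-analytics/observations/")
    .option("checkpointLocation", "s3a://example-bucket/checkpoints/observations/")
    .trigger(processingTime="1 minute")
    .start()
)

query.awaitTermination()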
Project ID: 15937640

About the project

12 proposals
Remote project
Active 6 years ago

12 freelancers are bidding an average of $21 USD/hour for this job
Yes, I have good knowledge of AWS architecture and various APIs. Let's discuss your current setup and the structure needed for further analysis.
$25 USD in 40 days
4.9 (16 reviews)
6.1
Hi, I have more than three years of experience with Hadoop technologies. Please contact me for more details.
$20 USD in 40 days
4.8 (13 reviews)
4.6
I am very strong in AWS with big data implementations. I have done an end-to-end implementation of the Hadoop ecosystem on AWS EC2, and I am strong in Spark, Kafka, Hadoop, Hive data warehousing, and newer technologies as well.
$20 USD in 40 days
5.0 (7 reviews)
3.9
Hello, I have two years of experience with big data technologies. I have done many projects with Hadoop, Spark, Flink, Kafka, Storm, R, machine learning, etc. I currently work as a big data administrator and developer.
$22 USD in 20 days
4.8 (8 reviews)
3.3
Certified Hadoop admin with expertise in the Hadoop ecosystem: Apache Hadoop, HDFS, Hive, Ambari, Hortonworks, HBase, SQL, Cloudera, MongoDB, Docker, Kafka, Storm, AWS, Linux, MySQL, Nagios, Ganglia, Titan, graph databases, Solr, and Kerberos. I will give a free demo and end-to-end solutioning with free troubleshooting.
$12 USD in 40 days
0.0 (0 reviews)
0.0
I am an AWS engineer with 2.8 years of IT experience, including hands-on work with DevOps tools and the OpenShift automation platform (Docker, Kubernetes). Relevant skills and experience: Git for version control, Jenkins for CI/CD, and Chef for configuration management.
$20 USD in 30 days
0.0 (0 reviews)
0.0
We have extensive experience working with global business advisory, ERP, HRMS, and software publisher companies. We help customers with in-depth knowledge of and transparency on licensing solutions, along with technology advisory services that deliver cost-effective solutions and a forward-thinking approach. Thanks to a highly skilled and flexible team, Online24x7 is in a position to provide services in the following areas:
• Microsoft Dynamics 365
• Ramco implementation (ERP & HCM/HRMS)
• PeoplesHR implementation (HCM/HRMS)
• IBM products
• DataMatics implementation for analytics and BI (business intelligence) tools
• Inventory management and asset management tools
• Brillio implementation partner for e-commerce and SOW-based work
• BPO/KPO services
• AWS/Azure cloud-based services
• Training and development (Amazon Web Services, Microsoft Azure, DevOps, Linux, Unix, Machine Learning, Big Data, Hadoop, Salesforce, Advanced Excel, Cloud Computing, IT Infrastructure)
• Resource pooling
• Microsoft licenses
We have well-experienced AWS, cloud, and DevOps consultants on our team. We are keen to work with you.
$27 USD in 40 days
0.0 (0 reviews)
0.0
• 10 years of experience in the IT industry, covering digital analytics, data warehousing, and database development.
• Expert-level skills and hands-on experience with Hadoop, Hive, Pig, Sqoop, Netezza, SQL, Oracle PL/SQL, ETL, Apache Spark, and Python.
• Experience migrating a product platform from Netezza to Hadoop.
• Worked on the Hortonworks distribution.
• Experience with Amazon Web Services (AWS).
• Design and requirements gathering, development, and unit testing.
• Expertise in shell scripting and Jenkins scheduling.
• Expertise in developing and implementing Hive and Pig scripts and Sqoop commands.
• Worked with the infrastructure team on cluster configuration and implementation for various clients.
• Good knowledge and understanding of the MapReduce framework and YARN.
$20 USD in 40 days
0.0 (0 reviews)
0.0

About the client

HYDERABAD, India
5.0
2
Payment method verified
Member since Feb 1, 2012
