Data Scraping is a process of extracting data from websites and databases. Data Scrapers are experts who can help collect this data and save clients both time and money by automating the process of data collection. These professionals use tools like Python, HTML, XML, Laravel, and more to access webpages and other sources of digital information in order to scrape large amounts of data and analyze it in meaningful ways.
Here are some projects that our expert Data Scrapers have made real:
Data Scraping is an incredibly valuable tool that can help companies increase efficiency by streamlining how they collect and maintain digital data. Our expert team of Data Scrapers is well-equipped to make these improvements in whatever form they’re needed. If you’re looking to improve your own business through data scraping, why not post your project on Freelancer.com? Our Data Scrapers are ready to help you reach your goals.
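To make the idea concrete, here is a minimal sketch of the extraction step at the heart of most scraping projects. It uses only Python's standard library (real projects would typically fetch pages with requests and parse with BeautifulSoup); the sample markup and the `company` class name are illustrative assumptions, not taken from any real site.

```python
from html.parser import HTMLParser

# Sketch: parse markup and pull out targeted fields. In practice the HTML
# would come from an HTTP fetch; here it is inlined for demonstration.
SAMPLE_HTML = """
<ul>
  <li class="company">Acme Widgets</li>
  <li class="company">Globex Corp</li>
</ul>
"""

class CompanyParser(HTMLParser):
    def __init__(self):
        super().__init__()
        self.in_company = False
        self.companies = []

    def handle_starttag(self, tag, attrs):
        # Flag <li class="company"> elements so handle_data captures their text
        if tag == "li" and ("class", "company") in attrs:
            self.in_company = True

    def handle_data(self, data):
        if self.in_company and data.strip():
            self.companies.append(data.strip())
            self.in_company = False

parser = CompanyParser()
parser.feed(SAMPLE_HTML)
print(parser.companies)  # -> ['Acme Widgets', 'Globex Corp']
```

The same pattern scales up: swap the inline sample for fetched pages and the single field for however many columns the client's spreadsheet needs.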
From 133,648 reviews, clients rate our Data Scrapers 4.9 out of 5 stars.
Hi, I need a spreadsheet of 500 car dealerships currently listed on DoneDeal (Ireland's largest classifieds site for vehicles).
Requirements:
• Only include dealerships with MORE than 25 active ads/listings on DoneDeal
• These should be second-hand car dealers — not private sellers
• DoneDeal marks dealer listings with a "Dealer" badge, so they're easy to identify
Data I need per dealer (in columns):
• Dealership name
• Contact person (owner/manager if visible)
• Email address (from their own website — visit their site and find it on the Contact/About page)
• Phone number
• Website URL
• County/Location
• Number of active DoneDeal ads (approximate is fine)
How to find them:
1. Go to → Cars section
2. Fil...
Project Overview
I am seeking an expert Magento 2 Developer with strong data extraction (scraping) capabilities to build out a comprehensive medical supply catalog. The previous developer started a sample set but is no longer on the project. I need a professional to scrape high-volume product data and perform a structured, complex import into my Magento 2 staging site.
Scope of Work
1. Data Scraping & Extraction
Source: Target website (medical supply industry) to be provided.
Requirements: Full extraction of Product Titles, High-Res Images, Long/Short Descriptions, SKUs, and Technical Specifications.
Variation Mapping: Correctly identify and link "Simple" products to their "Configurable" parents (e.g., mapping different sizes or packaging options to a single prod...
I have a collection of Excel and CSV files that need to be consolidated, reviewed, and cleaned so they are ready for analysis and reporting. The raw sheets contain blank cells, occasional typos, inconsistent date and number formats, and some possible duplicate records.
**Scope of Work**
* Consolidate multiple Excel/CSV files into a single organized workbook.
* Enter missing rows or columns based on scanned notes that I will provide (images or PDFs).
* Review and correct obvious spelling mistakes and numeric errors.
* Standardize formats across all columns (dates, currency values, percentages, etc.).
* Identify and remove duplicate records while preserving the most complete or first valid entry.
* Ensure the final dataset is accurate, clean, and ready for analysis.
**Deliverables**
1. ...
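The consolidate/standardize/de-duplicate steps described above can be sketched in a few lines. This is an illustrative stdlib-only version (pandas would be the usual choice at scale); the inline file contents, column names, and date formats are assumptions for demonstration.

```python
import csv
import io
from datetime import datetime

# Two "files" inlined for the demo; real input would be read from disk.
file_a = "name,joined\nAlice,2023-01-05\nBob,05/02/2023\n"
file_b = "name,joined\nAlice,2023-01-05\nCarol,2023-03-09\n"

def parse_date(raw):
    # Try the formats seen in the raw sheets; extend as new ones appear.
    for fmt in ("%Y-%m-%d", "%d/%m/%Y"):
        try:
            return datetime.strptime(raw, fmt).strftime("%Y-%m-%d")
        except ValueError:
            continue
    return raw  # leave unparseable values for manual review

seen, merged = set(), []
for blob in (file_a, file_b):
    for row in csv.DictReader(io.StringIO(blob)):
        row["joined"] = parse_date(row["joined"])  # standardize format
        key = row["name"].strip().lower()
        if key not in seen:  # keep the first valid entry per person
            seen.add(key)
            merged.append(row)

print(merged)
```

The duplicate-Alice row is dropped and Bob's `05/02/2023` is normalized to `2023-02-05`, leaving one clean, consistently formatted record per person.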
Thanks for looking. I urgently need data for a set of local businesses in and around Berkshire and London, UK. We will pay per 2,000-contact list for the industries we will send upon acceptance. Email addresses must not be role-based or trip any spam traps. I need a one-time extraction of verified email addresses from reputable online business directories. No other data fields are required—just the clean list of emails. Please choose whatever approach you prefer—Python with Scrapy/BeautifulSoup, browser automation with Selenium, or a similar tool chain—as long as the result is accurate and the scraping respects each site’s terms of service and rate limits.
Deliverable
• A CSV or XLSX file containing every unique email address you capture, de-duplicated and ready ...
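The core filtering logic for a job like this — extract addresses, drop role-based prefixes, de-duplicate — can be sketched as below. The sample text, the role-prefix list, and the addresses are illustrative assumptions; a real run would apply this to fetched directory pages and add proper verification.

```python
import re

# Common role-based prefixes the client does not want (assumed list).
ROLE_PREFIXES = {"info", "sales", "admin", "support", "contact", "office"}
EMAIL_RE = re.compile(r"[A-Za-z0-9._%+-]+@[A-Za-z0-9.-]+\.[A-Za-z]{2,}")

page_text = """
Contact: jane.doe@example.co.uk or info@example.co.uk
Also try JANE.DOE@example.co.uk and mark@widgets.example
"""

emails = set()
for match in EMAIL_RE.findall(page_text):
    email = match.lower()            # normalize case before de-duplicating
    local = email.split("@", 1)[0]
    if local not in ROLE_PREFIXES:   # skip role-based addresses
        emails.add(email)

print(sorted(emails))  # -> ['jane.doe@example.co.uk', 'mark@widgets.example']
```

Writing `sorted(emails)` out with the csv module gives the single-column, de-duplicated CSV deliverable the brief asks for.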
We are hiring remote contributors to create photo-based language data using everyday materials found around you. This project focuses on collecting natural, real-life text captured through a phone camera.
What You’ll Do
- Photograph common objects that contain written text (printed or handwritten).
- Provide three unique shots per item, changing position, distance, or lighting.
- Ensure content is original and varied.
- Most of the visible text (minimum 75%) must be in your local language.
Eligibility
- Fluent in the target language (native or near-native).
- Physically located in a country where the language is used.
- Own a smartphone capable of taking clear photos.
How It Works
- Upload images through a Google Form.
- Submissions are reviewed individually.
- Only valid, clear, ...
Project Description
I’m looking for a developer experienced in browser automation to build a solution that automatically updates SEO fields across many pages in website admin panels. The goal is to automate updating the SEO Title and Meta Description using data from a Google Sheet or CSV file.
The automation should work on the following platforms:
- Konimbo admin panel
- Shopify admin panel
And support updating the following page types:
- Categories / Collections
- Products
- Pages
Automation workflow
The script or extension should perform the following steps:
- Read a row from a CSV file or Google Sheet
- Open the corresponding edit_url in the admin panel
- Wait for the page to fully load
- Insert the SEO Title into the Title field
- Insert the Meta Description into the Description field
- C...
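The data-handling half of that workflow can be sketched independently of the browser: read each sheet row and build the update a Playwright or Selenium step would then apply. The column names (`edit_url`, `seo_title`, `meta_description`) and URLs here are assumptions, not the client's actual sheet layout.

```python
import csv
import io

# Inline stand-in for the Google Sheet / CSV export.
sheet = io.StringIO(
    "edit_url,seo_title,meta_description\n"
    "https://shop.example/admin/products/1/edit,Red Mug,A sturdy red mug\n"
    "https://shop.example/admin/pages/about/edit,About Us,Who we are\n"
)

def build_updates(csv_file):
    """Turn sheet rows into a list of update jobs for the browser step."""
    updates = []
    for row in csv.DictReader(csv_file):
        updates.append({
            "url": row["edit_url"],
            "title": row["seo_title"],
            "description": row["meta_description"],
        })
    return updates

for job in build_updates(sheet):
    # In the real automation, a browser step would open job["url"], wait
    # for the page to load, fill the Title and Description fields, and save.
    print(job["url"], "->", job["title"])
```

Separating "read the sheet" from "drive the browser" keeps the Konimbo and Shopify variants sharing one data path, with only the form-filling step differing per platform.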
I need a robust yet easy-to-maintain web scraper that pulls player statistics from four different sports sites—a blend of official league pages, sports news outlets, and a couple of well-known fan forums. All scraped data should flow into a single database and surface through a lightweight web dashboard where I can search by player, season, and team, compare numbers side by side, and export results to CSV. My ideal flow looks like this: enter or schedule the URLs, run or auto-run the scraper, watch progress logs, and then immediately view fresh stats inside the dashboard—no command-line work once everything is deployed. If any source changes its HTML, the scraper should fail gracefully and flag the issue in the UI so I can react quickly. Tech stack is flexible; Python with Be...
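The "fail gracefully when a source changes its HTML" requirement usually comes down to one rule: if the expected element is missing, return a flagged error instead of silently emitting bad data. A stdlib sketch, where the `span.points` selector and the stat name are illustrative assumptions:

```python
from html.parser import HTMLParser

class StatParser(HTMLParser):
    """Captures the text of <span class="points"> if present."""
    def __init__(self):
        super().__init__()
        self.capture = False
        self.points = None

    def handle_starttag(self, tag, attrs):
        if tag == "span" and ("class", "points") in attrs:
            self.capture = True

    def handle_data(self, data):
        if self.capture:
            self.points = data.strip()
            self.capture = False

def scrape_points(html):
    parser = StatParser()
    parser.feed(html)
    if parser.points is None:
        # Source layout changed: surface the problem for the dashboard UI
        return {"ok": False, "error": "selector 'span.points' not found"}
    return {"ok": True, "points": parser.points}

print(scrape_points('<span class="points">27.4</span>'))
print(scrape_points('<div class="pts">27.4</div>'))  # layout changed -> flagged
```

The dashboard can then filter on `ok == False` to show exactly which of the four sources needs a selector fix.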
Overview
We are seeking an experienced Laravel Developer / Automation Engineer to architect and build a self-sustaining system that gathers publicly available contact and professional information for individuals working in the low-voltage industry (e.g., low-voltage technicians, structured cabling installers, fire alarm techs, security system installers, access control technicians, etc.). The system will collect relevant professional data from platforms such as LinkedIn and additional job boards where users publicly publish resume and profile information.
The final solution must:
- Operate autonomously once deployed
- Allow geo-targeted searches by specific City/State
- Store structured data in a database designed like a spreadsheet
- Provide exportable, filterable datasets (CSV/Excel) ...
Need experienced dev to fix my claude-coded python playwright automation script. Note that I don't know any coding. So need your help to review and then fix. Details will be provided. Reply with "claude" so I know you've really read this very short job description.
Hi, I’m looking to build a system for affiliate marketing that can automatically collect product deals from popular Indian e-commerce platforms such as Amazon, Flipkart, Myntra, and other similar marketplaces. The goal is to scrape and collect the best deals and organize them so they can be used for affiliate promotion.
The system should extract the following product details:
- Product name
- Deal price / discounted price
- MRP / original price
- Product image
- Product link (affiliate-ready if possible)
The scraper should be reliable and able to run regularly (every 5 minutes) so that the data stays updated with the latest deals. I’m open to suggestions on the best technical approach to build this (APIs, scraping frameworks, automation, etc.), but it should be stable and scala...
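The refresh-and-rank step this brief describes can be sketched as below. The deal records are placeholders for what a scraper would actually return, and the 300-second interval mirrors the requested 5-minute cadence; in production the loop would more likely be a cron job or task queue than `sched`.

```python
import sched
import time

def discount_pct(mrp, deal_price):
    """Discount as a percentage of MRP, rounded to one decimal."""
    return round((mrp - deal_price) / mrp * 100, 1)

def refresh(scheduler=None):
    deals = [  # placeholder for freshly scraped listings
        {"name": "Trail Shoes", "mrp": 4999, "price": 2999},
        {"name": "Steel Bottle", "mrp": 899, "price": 649},
    ]
    for d in deals:
        d["discount"] = discount_pct(d["mrp"], d["price"])
    best = max(deals, key=lambda d: d["discount"])
    print("best deal:", best["name"], f'{best["discount"]}% off')
    if scheduler is not None:
        scheduler.enter(300, 1, refresh, (scheduler,))  # run again in 5 min

refresh()  # one manual pass; a long-running job would drive this via sched/cron
```

Ranking by computed discount rather than raw price keeps the affiliate feed focused on the "best deals" the client wants to promote.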
I’m ready to roll out a cold-email campaign aimed at entry-level healthcare organisations and I need someone who can own the entire flow—from building the contact database right through to pressing “send”. First, you’ll scrape and spider fresh, accurate email addresses that fit the brief: front-line or early-stage healthcare institutions. The list must be clean, verified and immediately usable for an email campaign. Once the data is in place, you’ll handle the mailing itself from your own warm, reputable mail server. I’ll supply the message copy and the single call-to-action link; your job is to deliver it to inboxes without triggering spam filters. The primary goal is lead generation and, ultimately, driving web traffic through the link I provid...
Project Overview
I have an Excel dataset containing approximately 11,440 licensed plumbing contractors in California (C-36 classification). The dataset includes contractor license information such as business name, city, address, and phone number. I am seeking an experienced data scraping / lead enrichment specialist to locate verified business email addresses associated with these contractors and append them to the spreadsheet. The goal is to identify publicly available contact emails belonging to the contractor’s business and integrate them into the existing dataset.
Source Dataset
The spreadsheet contains approximately 11,440 contractor records and includes fields such as:
• License Number
• BusinessName
• BUS-NAME-2 (DBA name, if present)
• FullBus...
We own social media sites like Wellington LIVE, Knock Knock and 15 community Facebook groups, approximately 500,000 profiles in total. We want to scrape real-estate sentiment and buyer intent from our owned pages, our websites, and from external platforms such as Google, Trade Me, Stuff, Herald, OneRoof, Reddit, and any other public forums. Build the AI intent-detection engine that identifies property-buyer comments, classifies hot/cold leads, applies internal profile masking, and triggers funnels. Then focus on AI enrichment agents, CRM sync, follow-ups, and automation scaling. When buyer intent appears, the system will classify it and push the user into a compliant lead-capture funnel, into a real-estate agent's dashboard as a new warm or hot lead. On platforms we own, we can directly trigger forms and bo...
I need help downloading and organizing all annual reports of companies listed on the Indonesia Stock Exchange (BEI) for 2021, 2022, 2023, and 2024.
Scope of work:
• Company coverage: all issuers listed on the BEI (I will send the complete list as soon as the project starts).
• File format: PDF, taken from official sources—usually the company's website or IDX.
• Folder structure: one folder per industry sector (following the BEI classification) per issuer.
• File naming: code , example: TLKM_2022.pdf.
Expected deliverables:
1. A main archive containing all sector folders with the reports inside.
2. A list of issuers categorized by sector within each folder, with a breakdown in Excel and note...
I need a fresh, accurate database of Australian business-to-business contacts collected and organised for easy import into my CRM. The dataset must cover companies operating nationwide and include decision-makers rather than generic info addresses.
Please gather the following for each record:
• Company name
• Full contact name
• Role title
• Direct email address
• Direct phone number (landline or mobile)
Quality is more important than sheer volume. Every email and phone number must be active, each name correctly matched to the right company and role, and duplicates removed. Publicly available sources, official company sites, and reliable business directories are all acceptable as long as you respect Australian data-privacy regulations. No purchased or o...
My goal is to build a reliable pipeline of Manchester-area Airbnb hosts—both brand-new listings and seasoned operators. For every lead you uncover, I’m paying on a per-lead basis, so accuracy matters more than volume.
What I need from you
• Verified contact details for each host (name, direct email, and, when possible, phone).
• Coverage across central Manchester and the surrounding suburbs; any property type is welcome as long as it’s listed on Airbnb.
• Leads delivered straight to my inbox the moment they’re vetted, so I can act on them without delay.
A lead counts as qualified only when the contact information is up-to-date and reaches the actual decision-maker for the listing. Bounce-backs or generic Airbnb addresses won’t be accepte...
I need a steady flow of high-quality leads in the technology space, covering both B2B and B2C segments. Your job is to research, verify, and deliver fresh contacts that fit clear targeting criteria we will finalise together at kickoff.
What I expect from you:
• A clean, well-formatted spreadsheet (CSV or Google Sheet) containing each lead’s name, role (or consumer profile), company (when applicable), email, and phone where available.
• Verifiable sources so I can spot-check accuracy.
• Brief notes on your outreach or data-gathering process, so I understand how each lead was obtained.
When you respond, attach a detailed project proposal explaining:
– Your research methodology and any tools or platforms you rely on (LinkedIn Sales Navigator, ZoomInfo, web...
International Lead Generation Expert Needed - Web Development Prospects (Verified & High-Intent Only).
Description: My web development agency is scaling up, and I need someone who can consistently deliver qualified prospects from specific regions - not just contact lists, but real opportunities.
The Challenge: I'm tired of buying generic databases or chasing cold leads that go nowhere. I want contacts who are actually in-market for web development services now - companies planning builds, frustrated with current developers, or actively posting RFPs.
Who I'm Looking For: Businesses or individuals that need:
- Custom website development
- Platform redesigns or migrations
- Ongoing technical development partnerships
Decision-makers only: CTOs, founders, marketing heads, operation...
I want to scrape all records from 8 URLs. Please check attached excel file for output sample.
My main AutoCAD software crashed and I’m locked out of several critical DWG files. I need someone who can jump on a remote session right away, reinstall AutoCAD 2018 or any newer release, activate it, and then open every drawing intact so no geometry, layers, Xrefs or sheet-set data goes missing.
Here’s what success looks like for me:
• AutoCAD 2018 (or any more recent release you recommend) installed cleanly, fully updated and opening without errors.
• All existing project files opened, audited and saved back with zero data loss; if you need to use RECOVER, RECOVERALL, command-line AUDIT, or third-party utilities, that’s fine as long as the model space and layouts stay intact.
• A short note on what caused the failure and any preventive tweaks...
I need clean, reliable product-level data pulled from popular e-commerce sites and delivered in a structured format I can feed straight into my analysis pipeline. The focus is on product details only—titles, descriptions, specifications, images, SKU, category tags and anything else that properly describes the item itself. Price information or customer reviews are not essential right now, but building the scraper in a way that could be extended to capture those later would be appreciated. Your crawler should respect , rotate proxies or user-agents as needed to avoid blocks, and export the results as either CSV or JSON; if you can offer both, even better. I’ll point you to the first two target sites once we start, then we can iterate quickly on additional domains that share a si...
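The politeness requirements in this brief — honor crawl rules and rotate user-agents — can be sketched with the standard library alone. The robots.txt content and agent strings below are illustrative assumptions; a real crawler would fetch each site's actual robots.txt and pace its requests.

```python
from itertools import cycle
from urllib.robotparser import RobotFileParser

# Inline stand-in for a site's robots.txt file.
robots_txt = """\
User-agent: *
Disallow: /checkout/
Allow: /products/
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

# Rotate through a small pool of identifying user-agent strings.
user_agents = cycle([
    "ProductBot/1.0 (+https://example.com/bot)",
    "ProductBot/1.0 variant-b",
])

def can_scrape(url, agent="*"):
    """Check the parsed crawl rules before fetching a URL."""
    return rp.can_fetch(agent, url)

print(can_scrape("https://shop.example/products/42"))  # allowed
print(can_scrape("https://shop.example/checkout/"))    # disallowed
print(next(user_agents))  # each request takes the next agent in the pool
```

Gating every fetch on `can_scrape` and drawing headers from the rotating pool covers the brief's two constraints before any product fields are extracted.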
WordPress developer and Data Entry expert needed
Job description
About Teckmo: Teckmo is a fast-growing digital agency offering web development, WordPress solutions, SEO, and digital marketing services. We work with startups and global clients to build high-performing websites and impactful digital campaigns.
Job Overview: We are looking for a WordPress Developer who is passionate about coding and eager to build a strong career in web development. Whether you're a fresher or have up to 10 years of experience, if you have the basics of PHP and WordPress, we’d love to hear from you. It's a work-from-home position with a salary range of ₹12,000 to ₹75,000/mo.
Responsibilities:
• Assist in developing and customizing WordPress websites.
• Support in creating and editing the...
I need a web app that pulls live prices from multiple peptide vendors, stores them, and then presents the data in a sortable table that ranks each seller from least to most expensive. In that same table I want to show two key columns—price per unit and product rating—so users can decide quickly who offers the best deal and quality.
Data collection
• The prices should refresh automatically once a week without me touching anything. A lightweight crawler, API integration, or another reliable method is fine so long as it respects each vendor’s terms.
• When a vendor changes a product name, drops stock, or adds a variant, the system needs to recognise and reflect that on the next scheduled pull.
Vendor interaction
A simple “Submit Your Store” form...
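The ranking step at the center of that table is straightforward once the weekly crawl has stored its rows. A sketch with placeholder vendor data (names, prices, and ratings are invented for illustration):

```python
# Placeholder rows standing in for what the weekly crawl would store.
vendors = [
    {"seller": "PepStore",  "price_per_unit": 1.45, "rating": 4.2},
    {"seller": "LabDirect", "price_per_unit": 0.99, "rating": 4.7},
    {"seller": "BioMart",   "price_per_unit": 1.20, "rating": 3.9},
]

# Cheapest-first ranking; the rating column rides along for the display.
ranked = sorted(vendors, key=lambda v: v["price_per_unit"])

for row in ranked:
    print(f'{row["seller"]:<10} ${row["price_per_unit"]:.2f}  {row["rating"]}')
```

In the actual app the sort key would simply become a query-string parameter so the table can be re-sorted on price or rating without re-crawling.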