The Ethical Web Data Collection Initiative (EWDCI) is an industry-led consortium of web data collectors focused on strengthening public trust, promoting ethical guidelines, and helping businesses and their customers make informed data extraction choices. The association aims to raise the bar for ethics in the process widely known as “data scraping” with the goal of enhancing trust—a key component of a free, fair, and open Internet.
Building an E-commerce Scraper
In this sponsored post, we sit down with Aleksandras Šulženko, Product Owner at Oxylabs.io, to discuss the past, present, and future of web scraping. He has been involved with both the technical and business sides of automated data collection for several years and has helped create some of the leading web scraping solutions on the market, such as the E-commerce Scraper API, dedicated to collecting publicly available data from e-commerce websites.
Using Node.js To Explain How Scraping Has Changed eCommerce
In this contributed article, Christoph Leitner from Zenscrape.com covers the basics of what Node.js is before moving on to its impact on eCommerce. He argues that Node.js has made web scraping more important to eCommerce than ever before for three main reasons: speed, cost, and customizability.
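The extraction step at the heart of an eCommerce scraper can be sketched in a few lines of Node.js. The snippet below is a minimal illustration, not code from the article: the HTML fragment and the `extractProducts` helper are invented for this example, and a production scraper would fetch live pages and use a proper HTML parser rather than a regular expression.

```javascript
// Minimal illustration of the extraction step in an eCommerce scraper.
// The HTML fragment and markup structure here are invented for this sketch;
// a real scraper would fetch live pages and use a robust HTML parser.
const sampleHtml = `
  <div class="product"><span class="name">Mug</span><span class="price">$12.50</span></div>
  <div class="product"><span class="name">Lamp</span><span class="price">$39.99</span></div>
`;

// Pull out { name, price } pairs with a regex — fine for a sketch,
// fragile against real-world markup.
function extractProducts(html) {
  const pattern = /<span class="name">([^<]+)<\/span><span class="price">\$([\d.]+)<\/span>/g;
  const products = [];
  let match;
  while ((match = pattern.exec(html)) !== null) {
    products.push({ name: match[1], price: parseFloat(match[2]) });
  }
  return products;
}

console.log(extractProducts(sampleHtml));
```

Node's non-blocking I/O is what Leitner's speed argument rests on: a scraper like this can have hundreds of page fetches in flight while parsing completed responses, without spawning a thread per request.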
Multi-Billion Dollar Businesses Benefit From Web Scraping. Can Yours?
In this contributed article, Andrius Palionis, VP of Enterprise Solutions at Oxylabs, discusses how businesses of all sizes can benefit from web scraping. Billion-dollar businesses got to where they are today by leading the industry in technological innovation, and data, which "fuels" the digital age, continues to grow in importance. Smaller companies now have the opportunity to leverage the same technology that provides the critical data needed to thrive in today's competitive business landscape.
Interview: Luminati CEO, Or Lenchner
I recently caught up with Or Lenchner, CEO at Luminati, to discuss his company's Data Collector product, an automated data collection tool that lets customers gather accurate data at scale quickly, easily, and without getting blocked. The Data Collector integrates and automates every stage of the data collection process while leaving customers in full control of the data they collect.
Diffbot State of Data Science, Engineering & AI Report – 2019
Diffbot, the AI startup with more knowledge than Google, released a new report on the state of the data science industry. The company developed the report using the Diffbot Knowledge Graph, an AI-curated, structured database of all of the public information on the web. Key findings include: IBM is leading in workforce size across all […]
What is Web Scraping?
In this contributed article, Hoda Raissi, COO of ParseHub, introduces web scraping and its importance to researchers and to various industries. She also shares her insights on what to look out for when choosing a web scraping tool, and how to make sure it will provide you with the data you need, in the format you need, before you invest your time and money in it.
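Raissi's point about getting data "in the format you need" usually means the final step of a scraping pipeline: serializing extracted records into something a spreadsheet or database can ingest. The sketch below is a hypothetical example (the `records` array is invented sample data, not output from any tool mentioned here) showing scraped records converted to CSV.

```javascript
// Sketch of the "format you need" step: turning scraped records into CSV.
// The records array is invented sample data for illustration.
const records = [
  { title: "Widget A", price: 9.99, inStock: true },
  { title: "Widget B", price: 14.5, inStock: false },
];

// Serialize an array of flat objects into CSV, quoting every field so
// commas or quotes inside values do not break the columns.
function toCsv(rows) {
  if (rows.length === 0) return "";
  const headers = Object.keys(rows[0]);
  const escape = (value) => `"${String(value).replace(/"/g, '""')}"`;
  const lines = [headers.map(escape).join(",")];
  for (const row of rows) {
    lines.push(headers.map((h) => escape(row[h])).join(","));
  }
  return lines.join("\n");
}

console.log(toCsv(records));
```

A tool that only hands back raw HTML pushes this structuring work onto you, which is exactly the kind of hidden cost to weigh before committing to it.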