17 thoughts on “Extracting data from Facebook using R”

  1. Hello,
    I have a list of Facebook pages in .csv format. I need to collect data from those pages, more than 1,000 of them, using RStudio. When I import the list into R, I can see all the pages, I can see the number of comments collected from each page, and I am able to write the output data to a .csv file. The problems I am facing are:
    1. I get output files for only six pages, while the input list contains more than 1,000 pages.
    2. My program terminates when it finds 0 posts on a page, and even when 20 posts were found before the termination, only six files are written to my directory.
    I need help solving these two problems.
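    A common cause of a loop stopping like this is an uncaught error on a page that returns 0 posts. One way to keep the loop going (a sketch, not the poster's actual code; `fetch_page` below is a hypothetical stand-in for the real `Rfacebook::getPage` call) is to wrap each fetch in `tryCatch` and skip failing pages:

```r
# Hypothetical stand-in for Rfacebook::getPage; raises an error
# when a page has no posts, as the real call can
fetch_page <- function(page_name) {
  if (page_name == "empty_page") stop("No posts found")
  data.frame(page = page_name, posts = 20, stringsAsFactors = FALSE)
}

page_list <- c("page_a", "empty_page", "page_b")

for (p in page_list) {
  # Convert an error into NULL so the loop can continue
  result <- tryCatch(fetch_page(p), error = function(e) NULL)
  if (is.null(result)) {
    message("Skipping ", p, ": no posts or fetch failed")
    next  # move on to the next page instead of terminating
  }
  write.csv(result, paste0(p, ".csv"), row.names = FALSE)
}
```

    This way a page with no posts is logged and skipped, and output files are still written for all the remaining pages.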

  2. > connect <- getURL(f_url)
    Error in function (type, msg, asError = TRUE) :
    Illegal characters found in URL

    I got this error while extracting data from Facebook using R. Please let me know the solution.

    • I used the following code to extract data from Facebook groups a few days ago and it worked fine:

      install.packages(c("RCurl", "rjson", "Rfacebook"))
      library(RCurl)
      library(rjson)
      library(Rfacebook)
      accessToken <- "XXXXXXXXXX"
      sas.group <- getGroup(group_id = 57805272392, token = accessToken, n = 5000)

      If you share your full code, we can try to help you.
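      As for the “Illegal characters found in URL” error above, it usually means the URL string contains spaces or other characters that need percent-encoding. Base R's `URLencode` can sanitize the URL before passing it to `getURL` (the `f_url` value here is purely illustrative):

```r
# A URL containing a space is illegal as-is for libcurl
f_url <- "https://graph.facebook.com/search?q=some query"
safe_url <- URLencode(f_url)  # spaces become %20
print(safe_url)
```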

  3. I found the solution to my problem a few months ago; sorry for the late update. Some names on the list didn't have Facebook pages, so the program terminated whenever no Facebook page was found. After I removed the names that didn't have Facebook pages, the loop kept running to the last name.

  4. Getting this error "Error in curl::curl_fetch_memory(url, handle = handle) :
    Timeout was reached" when running "me <- getUsers("me", token = acess_token)".
    Can you help? And is there any other way to extract data from Facebook using R?

    • Hi Vishal, what information are you looking to extract? I was able to run the code below and get data:

      install.packages(c("RCurl", "rjson", "Rfacebook"))
      library(RCurl)
      library(rjson)
      library(Rfacebook)
      accessToken <- "XXXXXXXXXXXXXXXXXXXX"
      me <- getUsers("me", token = accessToken)
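      A "Timeout was reached" error from curl often just means a slow connection or a temporarily unresponsive endpoint. One approach (an assumption, not something from this thread) is to retry the call a few times before giving up. A minimal retry wrapper, with `fetch_me` as a hypothetical stand-in for the `getUsers("me", ...)` call:

```r
# Retry a flaky call up to `tries` times before giving up
with_retry <- function(expr_fn, tries = 3) {
  for (i in seq_len(tries)) {
    result <- tryCatch(expr_fn(), error = function(e) e)
    if (!inherits(result, "error")) return(result)
    message("Attempt ", i, " failed: ", conditionMessage(result))
  }
  stop("All ", tries, " attempts failed")
}

# Hypothetical stand-in for getUsers("me", token = accessToken):
# fails on the first call, succeeds on the second
attempt <- 0
fetch_me <- function() {
  attempt <<- attempt + 1
  if (attempt < 2) stop("Timeout was reached")
  list(name = "me")
}

me <- with_retry(fetch_me)
```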

  5. You can use an R web crawler and scraper called RCrawler; it's designed to crawl, parse, store, and extract the contents of web pages automatically.
    install.packages("Rcrawler")
    See the manual for more detail.
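    A minimal usage sketch, based on the Rcrawler manual (the URL is illustrative, and the parameter values are assumptions; check the package documentation for your version):

```r
library(Rcrawler)

# Crawl a site and store its pages locally
Rcrawler(Website = "http://www.example.com", no_cores = 2, no_conn = 2)

# Or scrape a single known page directly with an XPath pattern
page <- ContentScraper(Url = "http://www.example.com",
                       XpathPatterns = c("//title"))
```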

  6. Thanks for sharing. There's another way to extract data from Facebook: just use a web scraping tool like Octoparse (www.octoparse.com); it would be much easier and more helpful.
