r - read.table returns extra rows
I am working with text files that have many long rows with a varying number of elements. The elements in each row are separated by \t, and each row is, of course, terminated by \n. I'm using read.table to read these text files. An example sample file is here: https://www.dropbox.com/s/6utslbnwerwhi58/samplefile.txt
The sample file has 60 rows.
The code to read the file:
sampledata <- read.table("samplefile.txt", as.is = TRUE, fill = TRUE)
dim(sampledata)
dim reports 70 rows when in fact there should be 60. When I try nrows = 60, like
sampledata <- read.table("samplefile.txt", as.is = TRUE, fill = TRUE, nrows = 60)
dim(sampledata)
it works; however, I don't know whether it is silently deleting information. My suspicion is that the last portions of some rows are being added as new rows. I don't know why that would be the case, though, given that I have fill = TRUE.
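To check whether that is what is happening, one rough diagnostic (assuming the file really is tab-delimited, as described) is to count the fields per physical line; read.table sizes its columns from only the first five lines of input, so any later, wider row gets wrapped onto an extra row when fill = TRUE:

nfields <- count.fields("samplefile.txt", sep = "\t")
length(nfields)  # number of physical lines in the file; should be 60
max(nfields)     # widest row; if this exceeds the width seen in the first five lines, wrapping occurs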
I have also tried
na.strings = "NA", fill = TRUE, strip.white = TRUE, blank.lines.skip = TRUE, stringsAsFactors = FALSE, quote = "", comment.char = ""
but to no avail.
Does anyone have an idea of what might be going on?
In the absence of a reproducible example, try this:
# make fake data
r <- c("1 2 3 4", "2 3 4", "4 5 6 7 8")
writeLines(r, "samplefile.txt")
# read line by line
r <- readLines("samplefile.txt")
# split on the separator
sp <- strsplit(r, " ")
# turn each line into a one-row data frame (for rbind.fill)
sp <- lapply(sp, function(x) as.data.frame(t(x)))
# bind them together
library(plyr)
rbind.fill(sp)
If that is similar to your actual problem, anyway.
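If the rows really are just wider further down the file, another option is to tell read.table up front how many columns to allocate, since it otherwise sizes them from only the first five lines of input. A sketch, assuming the file is tab-delimited and the column names do not matter:

nfields <- count.fields("samplefile.txt", sep = "\t")
sampledata <- read.table("samplefile.txt", sep = "\t", as.is = TRUE, fill = TRUE,
                         col.names = paste0("V", seq_len(max(nfields))))
dim(sampledata)  # should now be 60 rows by max(nfields) columns

This avoids the readLines/strsplit step entirely, at the cost of scanning the file twice.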