[R] R crashing with ggplot 2d histogram
Hi! R newbie here. I wrote a script for a correlation plot, a 2D histogram (heatmap-like) made with ggplot. I've run it before with a smaller dataset; it runs on my laptop and does what I want. Now I've extended my dataset and R crashes after the last line, which generates the plot. I get the "R session aborted / fatal error" message. My new dataset is composed of time series for two variables. For each variable there are 50 time series with 15,000 values each, so in total 15000*50 = 7.5E5 coordinates for this 2D histogram. Do I have a memory problem here, or is it just a script issue? Any suggestion to make it run? An ELI5 answer would be appreciated; my R experience is relatively low, I just use it here and there for plotting. Thanks!

Carlos Zapata

__
R-help@r-project.org mailing list -- To UNSUBSCRIBE and more, see https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html and provide commented, minimal, self-contained, reproducible code.
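For later readers: ggplot2's geom_bin2d() (or geom_hex()) counts points into bins before drawing, so 7.5E5 points is normally well within a laptop's reach. A minimal sketch with simulated data (the column names and bin count are placeholders, not the OP's actual script):

```r
# Simulate 750,000 (x, y) pairs and draw a 2D histogram.
# geom_bin2d() counts points per rectangular bin, so only the
# bin counts (here 100 x 100 cells) reach the graphics device.
library(ggplot2)

n  <- 750000
df <- data.frame(x = rnorm(n), y = rnorm(n))

p <- ggplot(df, aes(x, y)) +
  geom_bin2d(bins = 100)
print(p)
```

If a crash persists even with binned geoms, it is usually the graphics device rather than ggplot itself; saving with ggsave() instead of drawing to the interactive device is a quick way to tell the two apart.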
[R] R Memory Issues
-- Forwarded message --
From: Emiliano Zapata
Date: Sun, May 20, 2012 at 12:09 PM
Subject:
To: R-help@r-project.org

Hi,

I have a 64-bit machine (Windows) with a total of 192 GB of physical memory (RAM) and a total of 8 CPUs. I wanted to ask how I can make R use all the memory. I recently ran a script requiring approximately 92 GB of memory to run, and got the message:

cannot allocate memory block of size 2.1 Gb

I read on the web that if you increase the memory you have to reinstall R; would that be enough? Could I just increase the memory manually?

Thank you for any comments, or links on the web.

EZ
Re: [R] R Memory Issues
Alright then, thank you everyone. This information was extremely useful, and I'll do a better job searching the web next time.

On Sun, May 20, 2012 at 2:10 PM, Prof Brian Ripley wrote:
> On 20/05/2012 18:42, jim holtman wrote:
>> At the point in time that you get the error message, how big are the
>> objects that you have in memory? What does 'memory.size()' show as
>> being used? What does 'memory.limit()' show? Have you tried using
>> 'gc()' periodically to do some garbage collection? It might be that
>> your memory is fragmented. You need to supply some additional
>> information.
>
> Either this is a 32-bit version of R, in which case the wrong version is
> being used, or your advice is wrong: there are no credible fragmentation
> issues (and no need to use gc()) on a 64-bit build of R.
>
> But we have a posting guide, we require 'at a minimum information', and
> the OP failed to give it to us, so we are all guessing, completely
> unnecessarily.
>
> --
> Brian D. Ripley, rip...@stats.ox.ac.uk
> Professor of Applied Statistics, http://www.stats.ox.ac.uk/~ripley/
> University of Oxford, Tel: +44 1865 272861 (self)
> 1 South Parks Road, +44 1865 272866 (PA)
> Oxford OX1 3TG, UK, Fax: +44 1865 272595
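The diagnostics Jim Holtman mentions can be typed straight at the Windows R prompt. A short sketch; the size value is a placeholder, and note these functions were Windows-only in this era of R (and are defunct in recent versions):

```r
# Windows-only memory diagnostics (R 2.x / 3.x era; defunct in R >= 4.2)
memory.size()                # MB currently in use by this R session
memory.limit()               # current per-session limit in MB
memory.limit(size = 190000)  # raise the limit (in MB), if it is too low
gc()                         # run garbage collection and report usage
```

Running gc() right before the failing allocation shows how much of the heap is actually occupied at that point, which is the information the thread is asking the OP to provide.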
[R] utils:::menuInstallLocal()
Hello R,

I'm trying to install a package (class) locally, on a Windows 7, 64-bit machine. The only message I see on the R Console is:

utils:::menuInstallLocal()

nothing else. What does this mean? Shouldn't I get some sort of message on the Console?

EZ
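For context: `utils:::menuInstallLocal()` is simply the command that the Rgui menu item "Install package(s) from local zip files..." echoes to the console; output appears only after a file is chosen. The same installation can be done explicitly (the path below is a placeholder, not a real file on the OP's machine):

```r
# Install a package from a local Windows binary zip instead of a repository
install.packages("C:/path/to/class_7.3-5.zip",
                 repos = NULL,          # no repository: use the local file
                 type  = "win.binary")  # pre-built Windows binary

# Load it afterwards to confirm the installation worked
library(class)
```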
Re: [R] R Memory Issues
As a continuation of my original question, here is the message that I get:

Error in glm.fit(x = structure(c(1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, :
  cannot allocate memory block of size 2.1 Gb

The model (glm.fit) is a logistic-type model (in the GLM family). Maybe this is not enough information, again, but some feedback would be appreciated. To me the issue appears to be associated with the manipulation of a large dataset. However, the algorithm runs fine in Unix, but not in Windows (64-bit Windows 7).

EZ
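A rough way to see why a logistic glm() can exhaust memory: it builds a dense double-precision model matrix, whose size is easy to estimate. A back-of-envelope sketch; the 3e6 x 100 dimensions are made up for illustration, not the OP's actual data:

```r
# Back-of-envelope memory estimate for a dense GLM model matrix:
# n rows x p columns x 8 bytes per double, converted to gigabytes.
model_matrix_gb <- function(n, p) n * p * 8 / 2^30

model_matrix_gb(3e6, 100)  # roughly 2.2 Gb for 3M rows x 100 columns
```

Several working copies of objects this size can be alive at once inside glm.fit, so peak usage is a multiple of this estimate.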
[R] Memory Windows
This is my current version of R:

R version 2.15.0 (2012-03-30)
Copyright (C) 2012 The R Foundation for Statistical Computing
ISBN 3-900051-07-0
Platform: x86_64-pc-mingw32/x64 (64-bit)
OS: Windows

I get the following:

> memory.size()
[1] 23.04
> memory.limit()
[1] 190600

I also have the following:

C:\Users\zapatae\Desktop\Downloads\R-2.15.0\bin\x64\Rgui.exe --max-mem-size=190600M

However, R doesn't appear to be using the available memory, as I keep getting an error that it can't allocate 2.5 Gb. I was wondering if there is a reference discussing how R allocates memory in Windows. No problems in Unix.

EZ
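One way around a single huge allocation, whatever the Windows-vs-Unix difference turns out to be, is to fit the model in chunks rather than all at once. A hedged sketch using the CRAN package biglm (the formula, data frame name, and chunk size are placeholders; assumes biglm is installed):

```r
# Fit a logistic regression incrementally with biglm::bigglm(),
# which processes the data in chunks instead of building one
# dense model matrix for the whole dataset at once.
library(biglm)

fit <- bigglm(y ~ x1 + x2,
              data      = mydata,      # placeholder data frame
              family    = binomial(),
              chunksize = 10000)       # rows processed per chunk
summary(fit)
```

The coefficients match an ordinary glm() fit; only the memory profile differs.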
[R] R code: How to correct "Error in parse(text = x, keep.source = FALSE)" output in psych package using own dataset
Correlation of scores with factors            0.90  0.68  0.80  0.85
Multiple R square of scores with factors      0.81  0.46  0.64  0.73
Minimum correlation of factor score estimates 0.62 -0.08  0.27  0.45

Total, General and Subset omega for each subset
                                              g    F1*  F2*  F3*
Omega total for total scores and subscales    0.93 0.92 0.82 0.80
Omega general for total scores and subscales  0.79 0.69 0.48 0.50
Omega group for total scores and subscales    0.14 0.23 0.35 0.31

To get the standard sem fit statistics, ask for summary on the fitted object
```

I'm expecting to have the same output when applying the function directly to my data. What I want to make sure of is whether it is mandatory to do the Schmid transformation before omegaSem(). I suppose not, because that is not how it is supposed to work according to the guide. Maybe this can be solved by correcting the error message:

```
> r9 <- my.data
> omegaSem(r9, n.obs=198)
Error in parse(text = x, keep.source = FALSE) :
  :2:0: unexpected end of input
1: ~
   ^
```

I hope I've been clear enough. Feel free to ask for any other information you might need. Thank you so much for giving me any guidance toward the answer to this issue; I highly appreciate any help.

Regards,
Danilo
--
Danilo E. Rodríguez Zapata
Psychometrics Analyst
CEBIAC
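A quick diagnostic that foreshadows where this thread ends up: before calling omegaSem(), check how strongly the items correlate, since near-perfectly correlated columns can leave omega() unable to extract all the requested factors. A sketch (my.data is the OP's placeholder name for the dataset):

```r
# Inspect the range of inter-item correlations; values very close to 1
# suggest omega() may return an empty factor ("~"), which in turn can
# make omegaSem() fail with the parse() error shown above.
r <- cor(my.data, use = "pairwise.complete.obs")
range(r[upper.tri(r)])
```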
Re: [R] R code: How to correct "Error in parse(text = x, keep.source = FALSE)" output in psych package using own dataset
Dear William,

Thank you for your answer. I would like to add some information that I just obtained looking in different sites and forums. Someone there asked me to create a fake data file, so I did that from my original data file. What I did was open the .csv file with Notepad and replace all the 4s with 5s and the 2s with 1s, then I saved the file again with no other changes. I also searched for "~" in the file and found nothing. Now with that file I ran the omegaSem() function and it worked successfully. So the weird thing here is that omegaSem() works with the fake data file, which is exactly the same as the original file but with some answers recoded, as I said.

It seems to be an issue with the file. When I replace, let's say, the 5s with 6s and run omegaSem() again, it works. Then I replace the 6s back with 5s in all the data and the function doesn't work anymore.

On Thu, Aug 29, 2019 at 12:33, William Dunlap wrote:
> > omegaSem(r9,n.obs=198)
> Error in parse(text = x, keep.source = FALSE) :
>   :2:0: unexpected end of input
>
> This error probably comes from calling factor("~"), and psych::omegaSem(data)
> will do that if all the columns in data are very highly correlated with one
> another. In that case omega(data, nfactor=n) will not be able to find n
> factors in the data, but it returns "~" in place of the factors that it
> could not find. E.g.,
>
> > fakeData <- data.frame(A=1/(1:40), B=1/(2:41), C=1/(3:42), D=1/(4:43), E=1/(5:44))
> > cor(fakeData)
>           A         B         C         D         E
> A 1.0000000 0.9782320 0.9481293 0.9215071 0.8988962
> B 0.9782320 1.0000000 0.9932037 0.9811287 0.9684658
> C 0.9481293 0.9932037 1.0000000 0.9969157 0.9906838
> D 0.9215071 0.9811287 0.9969157 1.0000000 0.9983014
> E 0.8988962 0.9684658 0.9906838 0.9983014 1.0000000
>
> > psych::omegaSem(fakeData)
> Loading required namespace: lavaan
> Loading required namespace: GPArotation
> In factor.stats, I could not find the RMSEA upper bound. Sorry about that
> Error in parse(text = x, keep.source = FALSE) :
>   :2:0: unexpected end of input
> 1: ~
>    ^
> In addition: Warning message:
> In cov2cor(t(w) %*% r %*% w) :
>   diag(.) had 0 or NA entries; non-finite result is doubtful
>
> > psych::omega(fakeData)$model$lavaan
> In factor.stats, I could not find the RMSEA upper bound. Sorry about that
> [1] g =~ +A+B+C+D+E   F1=~ + B + C + D + E   F2=~ + A
> [4] F3=~
> Warning message:
> In cov2cor(t(w) %*% r %*% w) :
>   diag(.) had 0 or NA entries; non-finite result is doubtful
>
> You can get a result if you use nfactors=n, where n is the number of good
> F entries in psych::omega()$model$lavaan:
>
> > psych::omegaSem(fakeData, nfactors=2)
> ...
> Measures of factor score adequacy
>                                                    g     F1*      F2*
> Correlation of scores with factors             11.35   12.42    84.45
> Multiple R square of scores with factors      128.93  154.32  7131.98
> Minimum correlation of factor score estimates 256.86  307.64 14262.96
> ...
> Does that work with your data?
>
> This is a problem that the maintainer of psych,
> > maintainer("psych")
> [1] "William Revelle "
> would like to know about.
> Bill Dunlap
> TIBCO Software
> wdunlap tibco.com
>
> On Thu, Aug 29, 2019 at 9:03 AM Danilo Esteban Rodriguez Zapata via R-help wrote:
>> This is a problem related to my last question, referring to the omegaSem()
>> function in the psych package (that one is already solved, because I realized
>> that I was missing a variable assignment and because of that I had an
>> 'object not found' error):
>>
>> https://stackoverflow.com/questions/57661750/one-of-the-omegasem-function-arguments-is-an-object-not-found
>>
>> I was trying to use that function following the guide to find McDonald's
>> hierarchical omega by Dr William Revelle:
>>
>> http://personality-project.org/r/psych/HowTo/omega.pdf
>>
>> So now, with the variable error corrected, I'm having a different error that
>> does not occur when I use the same function with the example database
>> (Thurstone) provided in the tutorial that comes with the psych package. I
>> mean, I'm able to use the function successfully with the Thurstone data
>> (with no other action, I have the expected result), but the function
>> doesn't work when I use my own data.
>>
>> I searched over other posted questions, and the actions that they perform
>> are not even similar to what I'm trying to do. I have almost two weeks
>> using R, so I'
Re: [R] R code: How to correct "Error in parse(text = x, keep.source = FALSE)" output in psych package using own dataset
Well, the output with the code that you refer to is the following:

> psych::omega(my.data)$model$lavaan
[1] g =~ +AUT_10_04+AUN_07_01+AUN_07_02+AUN_09_01+AUN_10_01+AUT_11_01+AUT_17_01+AUT_20_03+CRE_05_02+CRE_07_04+CRE_10_01+CRE_16_02+EFEC_03_07+EFEC_05+EFEC_09_02+EFEC_16_03+EVA_02_01+EVA_07_01+EVA_12_02+EVA_15_06+FLX_04_01+FLX_04_05+FLX_08_02+FLX_10_03+IDO_01_06+IDO_05_02+IDO_09_03+IDO_17_01+IE_01_03+IE_10_03+IE_13_03+IE_15_01+LC_07_03+LC_08_02+LC_11_03+LC_11_05+ME_02_03+ME_07_06+ME_09_01+ME_09_06+NEG_01_03+NEG_05_04+NEG_07_03+NEG_08_01+OP_03_05+OP_12_01+OP_14_01+OP_14_02+ORL_01_03+ORL_03_01+ORL_03_05+ORL_10_05+PER_08_02+PER_16_01+PER_19_06+PER_22_06+PLA_01_03+PLA_05_01+PLA_07_02+PLA_10_01+PLA_12_02+PLA_18_01+PR_06_02+PR_15_03+PR_25_01+PR_25_06+REL_09_05+REL_14_03+REL_14_06+REL_16_04+RS_02_03+RS_07_05+RS_08_05+RS_13_03+TF_03_01+TF_04_01+TF_10_03+TF_12_01+TRE_09_05+TRE_09_06+TRE_26_04+TRE_26_05
[2] F1=~
[3] F2=~ + AUN_07_02 + CRE_05_02 + CRE_07_04 + CRE_16_02 + EFEC_09_02 + EVA_12_02 + FLX_08_02 + IDO_01_06 + IDO_05_02 + LC_08_02 + LC_11_03 + LC_11_05 + ME_02_03 + ME_07_06 + ME_09_06 + NEG_07_03 + OP_03_05 + OP_14_01 + OP_14_02 + ORL_01_03 + ORL_03_01 + PER_08_02 + PER_19_06 + PLA_05_01 + PLA_07_02 + PLA_10_01 + PLA_12_02 + PLA_18_01 + PR_06_02 + PR_15_03 + PR_25_01 + PR_25_06 + REL_14_06 + REL_16_04 + TF_04_01 + TF_10_03 + TRE_26_04 + TRE_26_05
[4] F3=~ + AUT_10_04 + AUN_07_01 + AUN_09_01 + AUN_10_01 + AUT_11_01 + AUT_17_01 + AUT_20_03 + CRE_10_01 + EFEC_03_07 + EFEC_05 + EFEC_16_03 + EVA_02_01 + EVA_07_01 + EVA_15_06 + FLX_04_01 + FLX_04_05 + FLX_10_03 + IDO_09_03 + IDO_17_01 + IE_01_03 + IE_10_03 + IE_13_03 + IE_15_01 + LC_07_03 + ME_09_01 + NEG_01_03 + NEG_05_04 + NEG_08_01 + OP_12_01 + ORL_03_05 + ORL_10_05 + PER_16_01 + PER_22_06 + PLA_01_03 + REL_09_05 + REL_14_03 + RS_02_03 + RS_07_05 + RS_08_05 + RS_13_03 + TF_03_01 + TF_12_01 + TRE_09_05 + TRE_09_06
Re: [R] R code: How to correct "Error in parse(text = x, keep.source = FALSE)" output in psych package using own dataset
Thank you so much, I'll wait until then. The good thing is that we can be sure now what the actual problem is. I wish you a good rest.

On Thu, Aug 29, 2019 at 14:55, William R Revelle (reve...@northwestern.edu) wrote:
> Hi all.
>
> I am taking a brief vacation and will look at this next week.
>
> Bill
>
> > On Aug 29, 2019, at 2:53 PM, William Dunlap wrote:
> >
> > Element #2 of that output, the empty formula " F1=~ ", triggers the bug
> > in omegaSem. omegaSem needs to ignore such entries in omega's output.
> > psych's author should be able to fix things up.
> >
> > Bill Dunlap
> > TIBCO Software
> > wdunlap tibco.com