y = as.numeric(as.matrix(data[,match(outcome,colnames(data))]))

This change is necessary when the outcome is defined as a factor or TRUE/FALSE variable. Also, make sure Y is either boolean or a 0/1 variable.
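As a minimal sketch (reusing the `data` and `outcome` names from the line above, which are assumptions from that snippet), the conversion and check could look like this:

```r
# Convert a factor or TRUE/FALSE outcome to a numeric 0/1 vector.
# Note: as.numeric() on a factor returns the internal level codes (1/2),
# so go through the character representation instead.
y <- data[, match(outcome, colnames(data))]
if (is.factor(y))  y <- as.numeric(as.character(y))
if (is.logical(y)) y <- as.numeric(y)
stopifnot(all(y %in% c(0, 1)))  # Y must end up as 0/1
```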

The example below then works for me:

set.seed(12)
x = array(rnorm(4000, 0, 1), dim = c(1000, 4))
y = rbinom(1000, 1, 1/(1 + exp(-(-3.2 + 0.5*x[,1] + 1.2*x[,2] - 0.7*x[,3] + 0.8*x[,4]))))
mydata = as.data.frame(cbind(x, y))
colnames(mydata) = c("A1", "A2", "B1", "B2", "Y")
mydata$Y = as.factor(mydata$Y)
fmla = as.formula("Y ~ (A1 * A2) + (B1 * B2)")
model = mle.logreg.constrained(fmla, mydata)
model

This yields the following results for the regression coefficients:

(Intercept): -2.646302771

A1: 0.351304056

A2: 1.007674551

B1: 0.000000000

B2: 0.608490582

A1:A2: 0.001191156

B1:B2: 0.000000000

Hope it works for you now!

Sorry for that.

However, models such as Y ~ (A1 * A2) + (B1 * B2) trigger an error, whereas Y ~ (A1 : A2) + (B1 : B2) passes. If I reformulate Y ~ (A1 * A2) + (B1 * B2) as Y ~ A1 + A2 + (A1 * A2) + B1 + B2 + (B1 * B2), the coefficients can still be negative.
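For what it's worth, in R's formula syntax A1 * A2 expands to the main effects plus the interaction, while A1 : A2 denotes the interaction term alone, so the two formulas build different design matrices, which may explain why one errors and the other passes. A quick way to inspect the expansion with base R only:

```r
# Compare which terms each formula operator actually generates
f1 <- Y ~ (A1 * A2) + (B1 * B2)   # main effects + interactions
f2 <- Y ~ (A1 : A2) + (B1 : B2)   # interactions only

attr(terms(f1), "term.labels")
# "A1" "A2" "B1" "B2" "A1:A2" "B1:B2"

attr(terms(f2), "term.labels")
# "A1:A2" "B1:B2"
```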

Can you point me to my mistake or give a hint toward a solution? That would be very nice.

To delete a comment, just log in and view the post's comments; there you will have the option to edit or delete them.