Your code looks fine, but converting to radians before subtracting isn’t necessary. They’re just units: doing your math in miles and then converting to km gives the same result as converting to km first and then doing the math.
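To spell out why: degree-to-radian conversion is just multiplication by a constant (π/180), so it distributes over subtraction. A minimal numerical check in Python (the coordinate values are arbitrary examples):

```python
import math

lat1, lat2 = 40.7128, 34.0522  # illustrative latitudes in degrees

# Converting after subtracting...
a = math.radians(lat2 - lat1)
# ...equals subtracting after converting, because radians() is linear.
b = math.radians(lat2) - math.radians(lat1)

print(math.isclose(a, b))  # True
```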

Hi Kurt!

The output is in miles, right?

Is there a slick way to use tmerge to start from a single data frame, with your customer and event data frames merged by ID, and an additional column with support time 100 for subject 3 and NA in this column for the other participants? I’m trying to build an example for a class I’m teaching, based on the Emerson leukemia data set in Case Studies in Biometry,

at http://ftp.uni-bayreuth.de/math/statlib/datasets/csb/ch14.dat . There’s a column for last follow-up and an indicator for death or censoring, and another column for bone marrow transplant time with an indicator for whether the transplant occurred. I want to estimate the time-dependent effect of transplant. If I follow your model, I should first build a data frame analogous to your support data, with time 0 for everyone in the data set and indicator zero, and then rbind it to another data set for those who had the transplant, with the transplant time and indicator 1. Is this the best way to do it? Any suggestions would be appreciated.
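For what it’s worth, the start–stop layout that this approach should end up producing can be sketched outside of R too. Here is a minimal illustration in Python/pandas of splitting follow-up at the transplant time (the subjects and times are hypothetical, not the ch14.dat values; the real exercise would use tmerge on that data set):

```python
import pandas as pd

# Hypothetical follow-up data: last follow-up time, death indicator,
# and transplant time (None if no transplant occurred).
subjects = pd.DataFrame({
    "id": [1, 2, 3],
    "futime": [300, 150, 400],
    "death": [1, 0, 1],
    "txtime": [None, None, 100.0],
})

rows = []
for s in subjects.itertuples():
    if pd.notna(s.txtime):
        # Split follow-up at transplant: covariate 0 before, 1 after.
        rows.append((s.id, 0, s.txtime, 0, 0))               # pre-transplant, no event
        rows.append((s.id, s.txtime, s.futime, 1, s.death))  # post-transplant
    else:
        rows.append((s.id, 0, s.futime, 0, s.death))

long = pd.DataFrame(rows, columns=["id", "tstart", "tstop", "transplant", "event"])
print(long)
```

Subject 3 ends up with two intervals, (0, 100] with transplant = 0 and (100, 400] with transplant = 1, which is the counting-process format a time-dependent Cox model needs.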

During code review, my colleague pointed out to me that we have to subtract the coordinates in radian form instead of raw degree coordinates.

Please take a look below and let me know whether you agree that our approach is correct.

Thank you

Yours

2 * 3961 * asin(sqrt((sin(radians((lat2 - lat1) / 2))) ^ 2 + cos(radians(lat1)) * cos(radians(lat2)) * (sin(radians((lon2 - lon1) / 2))) ^ 2))

Ours

2 * 3961 * asin(sqrt((sin((radians(lat2) - radians(lat1)) / 2)) ^ 2 + cos(radians(lat1)) * cos(radians(lat2)) * (sin((radians(lon2) - radians(lon1)) / 2)) ^ 2))
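As a sanity check, both variants can be implemented side by side; because radians() is linear, they agree. A sketch in Python (coordinates below are illustrative, roughly New York and Los Angeles):

```python
import math

R_MILES = 3961  # Earth's radius in miles, matching the formulas above

def dist_convert_after(lat1, lon1, lat2, lon2):
    # "Yours": subtract in degrees, then convert the half-difference to radians.
    return 2 * R_MILES * math.asin(math.sqrt(
        math.sin(math.radians((lat2 - lat1) / 2)) ** 2
        + math.cos(math.radians(lat1)) * math.cos(math.radians(lat2))
        * math.sin(math.radians((lon2 - lon1) / 2)) ** 2))

def dist_convert_first(lat1, lon1, lat2, lon2):
    # "Ours": convert each coordinate to radians first, then subtract.
    return 2 * R_MILES * math.asin(math.sqrt(
        math.sin((math.radians(lat2) - math.radians(lat1)) / 2) ** 2
        + math.cos(math.radians(lat1)) * math.cos(math.radians(lat2))
        * math.sin((math.radians(lon2) - math.radians(lon1)) / 2) ** 2))

a = dist_convert_after(40.7128, -74.0060, 34.0522, -118.2437)
b = dist_convert_first(40.7128, -74.0060, 34.0522, -118.2437)
print(round(a), round(b), math.isclose(a, b))
```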

Hello Dayne,

Referring to your first paragraph above, I’d say that the probability that the user churns in the next 30 days, given survival to day X, is (S(X) – S(X+30)) / S(X).
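To make the formula concrete, here is a small sketch using a hypothetical exponential survival function S(t) = exp(−t/τ) (any survival function would work; the exponential is just easy to write down, and its memorylessness makes the conditional churn probability the same for every X):

```python
import math

TAU = 200.0  # hypothetical mean customer lifetime in days

def S(t):
    # Survival function: probability of remaining active past day t.
    return math.exp(-t / TAU)

def p_churn_next_30(x):
    # P(churn in (X, X+30] | still active at day X) = (S(X) - S(X+30)) / S(X)
    return (S(x) - S(x + 30)) / S(x)

print(round(p_churn_next_30(0), 4), round(p_churn_next_30(90), 4))
```

For the exponential case this simplifies to 1 − exp(−30/τ), independent of X; for a real, non-exponential survival curve the answer would depend on how long the user has already survived.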

Thank you for your great help. I’ve been learning a lot from your stories.

Zbynek

I am looking to export unsampled unique-user data going back to January 1st, 2010.

Would this be possible and are you available for consulting?

Same here, and the problem is in the training set too.
