Updating to CRAN version 2.2.4
Bringing Github back in sync with CRAN versioning - my bad.
killick committed Nov 8, 2022
1 parent 7aacc73 commit 04c67b4
Showing 20 changed files with 674 additions and 419 deletions.
14 changes: 7 additions & 7 deletions DESCRIPTION
@@ -1,28 +1,28 @@
Package: changepoint
Type: Package
Title: Methods for Changepoint Detection
Version: 2.2.3
Date: 2022-03-08
Version: 2.2.4
Date: 2022-10-31
Authors@R: c(person("Rebecca", "Killick",
role=c("aut","cre"),email="[email protected]"),
person("Kaylea", "Haynes", role="aut"),
person("Idris", "Eckley", role=c("ths")),
person("Paul","Fearnhead",role=c("ctb","ths")), person("Jamie","Lee",role="ctr"))
person("Paul","Fearnhead",role=c("ctb","ths")), person("Robin", "Long", role="ctb"),person("Jamie","Lee",role="ctr"))
Maintainer: Rebecca Killick <[email protected]>
BugReports: https://github.com/rkillick/changepoint/issues
URL: https://github.com/rkillick/changepoint/
Description: Implements various mainstream and specialised changepoint methods for finding single and multiple changepoints within data. Many popular non-parametric and frequentist methods are included. The cpt.mean(), cpt.var(), cpt.meanvar() functions should be your first point of call.
Depends: R(>= 3.1), methods, stats, zoo(>= 0.9-1)
Suggests: testthat
Suggests: testthat, vdiffr
License: GPL
LazyLoad: yes
LazyData: true
Packaged: 2022-03-14 14:14:13 UTC; killick
Packaged: 2022-10-31 14:14:13 UTC; killick
NeedsCompilation: yes
Repository: CRAN
Date/Publication: 2022-03-14 15:50:02 UTC
Date/Publication: 2022-10-31 15:50:02 UTC
Author: Rebecca Killick [aut, cre],
Kaylea Haynes [aut],
Idris Eckley [ths],
Paul Fearnhead [ctb, ths],
Robin Long [ctb],
Jamie Lee [ctr]
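
For reference (not part of this commit), a minimal usage sketch of the cpt.mean(), cpt.var() and cpt.meanvar() entry points named in the Description field above; the simulated data and argument values are illustrative assumptions only:

    # install.packages("changepoint")   # if not already installed
    library(changepoint)
    set.seed(1)
    x <- c(rnorm(100, mean = 0), rnorm(100, mean = 3))  # illustrative series with one mean change at t = 100
    fit <- cpt.mean(x, penalty = "MBIC", method = "PELT")
    cpts(fit)        # estimated changepoint location(s)
    param.est(fit)   # per-segment mean estimates
    plot(fit)        # data with fitted segment means overlaid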
63 changes: 9 additions & 54 deletions NEWS
@@ -1,7 +1,12 @@
Version 2.2.4
=============
* Updated C code to be compliant with new CRAN flags
* Updated check for 1D objects to also check for 1D arrays. Thanks to Github user ChristopherEeles for highlighting the bug.

Version 2.2.3
=============
* Updated documentation to reflect that the changepoint is the last observation of a segment, not the first observation of a new segment as previously described. Thanks to Alex Bertram for reporting this.
* Added functionality for cpt.range into cpts to allow specification of ncpts. Use as cpts(object,ncpts=X) to retrieve the X-changepoint segmentation from object which has a cpt.range class. Thanks to Craig Faunce for the suggestion.
* Added functionality for cpt.range into cpts to allow specification of ncpts. Use as cpts(object,ncpts=X) to retrieve the X-changepoint segmentation from an object which has cpt.range class. Thanks to Craig Faunce for the suggestion.
* Corrected parameter estimation for cpt.reg class so that sig2 contains the variance MLE per segment. This addresses a consistency issue with lm when evaluating logLik when no changepoints are found. Thanks to Simon Taylor for spotting this.
* Corrected a bug in parameter estimation for cpt.var with cpt.range class when ncpts!=NA. Previously this used the seg.len() function, which returned NA for CROPS and incorrect segment lengths for BinSeg. Thanks to David Krider for asking a question which led to this bug being identified.
* Added "type=" functionality to the "diagnostic=TRUE" graphs. Thanks go to github user hoxo_m for providing the code and tests for this.
Expand Down Expand Up @@ -93,81 +98,57 @@ Version 2.0

Version 1.1.5
=============
Changes to previous version

* Modified citation information

Version 1.1.3
=============
Changes to previous version

* Upgraded zoo from "imports" to "depends". This is needed because more functionality from zoo is required and this also gives users access to the zoo features directly whereas previously they had to library the package separately.
* The segment neighbourhood algorithm has now been updated to make sure that the changes in variance or mean and variance do not return segments of length 1. This is now in line with the PELT and BinSeg algorithms.
* Added citation information.
* Documentation pages for cpt.mean, cpt.meanvar and cpt.var were updated to be clearer. Thanks to Toby Hocking for highlighting this.

Version 1.1.2
=============
Changes to previous version

* The check introduced in version 1.1.1 was causing problems in older versions as there was a mask between S3 and S4 generics. Thus we removed the masking effect and incremented the minimum version of R to 3.0. Thanks go to Philipp Boersch-Supan for highlighting the bug.
* Added Kaylea Haynes as author of package.

Version 1.1.1
=============
Changes to previous version

* Added a check on the generics used to check if the generics "coef" and "logLik" exist and if not to create them. This was creating problems with backwards R compatibility as "coef" and "logLik" were only introduced as generics in recent versions of R.

Version 1.1
===========
Changes to previous version

* Added printing of the p-value to single changepoint likelihood-based functions. Thanks go to anonymous reviewers for this suggestion.
* Changed all Segment Neighbourhood and Binary Segmentation functions to print a warning if the number of changes/segments identified equals Q.

Version 1.0.6
=============
Changes to previous version

* Fixed bug where the likelihood & logLik generics returned an error when there are no changes. This was due to using the cpts generic which no longer returns the length of the data as a change (since version 1.0). Thanks go to Gia-Thuy Pham for highlighting this bug.
* Corrected calculation of segment scale parameter estimates for Gamma models (used in cpt.meanvar(...,test.stat='Gamma')). Previously this was dividing by seg.length+1 when it should be dividing by seg.length. Thanks go to Martyn Byng for highlighting the error.

Version 1.0.5
=============
Changes to previous version

* Added journal references to PELT documentation. Thanks go to Martyn Byng for pointing out that this required updating.
* Added checks on minimum lengths of data input (minimum for mean is 1 per segment, for variance / mean & variance it is 2 observations per segment). Thanks go to Toby Hocking for highlighting the error.

Version 1.0.4
=============
Changes to previous version

* Changed all the Binary Segmentation multiple changepoint algorithms to have a default oldmax parameter of Inf. The old default (1000) was causing problems when users were wanting to do elbow plots for penalty choice. Thanks go to Brian Jessop for highlighting the problem. Note: There was no problem with the changepoints returned by the previous code. This change purely relates to elbow plot penalty choice.

Version 1.0.3
=============
Changes to previous version

* Changed a few commands where na.rm=F to na.rm=T. This was causing problems when users entered constant data as some test statistics could produce NAs. Thanks go to Harutyun Khachatryan for pointing out this bug.

Version 1.0.2
=============
Changes to previous version

* Changed argument name in cpt.mean, cpt.var and cpt.meanvar. Previously was named 'dist', now named 'test.stat'. Not backwards compatible. This is in line with the major changes in version 1.0 to the class structure.

Version 1.0.1
=============
Changes to previous version

* Corrected naming of columns returned by logLik and likelihood generics. These return the scaled negative likelihood and scaled negative likelihood + penalty.

Version 1.0
===========
Changes to previous version

* Removed printing and plotting of the last observation as the last cpt
* The distribution class slot for the cpt class has been renamed to test.stat (backwards compatibility remains - for now)
* Changed value parameter in all functions to be pen.value (no backwards compatibility for named variables but ordering of arguments remains the same)
@@ -181,61 +162,43 @@ Changes to previous version

Version 0.8
===========
Changes to previous version

* Restructured C implementation of PELT to remove unnecessary function calls (minor speed improvement)
* Updated FTSE100 data
* Added Lai2005fig3 and Lai2005fig4 data

Version 0.7.1
=============
Changes to previous version

* added BIC1, SIC1, AIC1, Hannan-Quinn1 penalties (counting the changepoint as a parameter in contrast to BIC, SIC, AIC, Hannan-Quinn which do not)

Version 0.7
===========
Changes to previous version

* added change in Poisson distribution to cpt.meanvar

Version 0.6.1
=============
Changes to previous version

* removed plotting of "n" as a changepoint for changes in variance
* changed array allocation to calloc in PELT C code
* added checks to C code for user interruption (including appropriate memory handling)

Version 0.6
===========
Changes to previous version

* PELT algorithms now run using C code
* Binary Segmentation algorithms now run using C code

Version 0.5
===========
Changes to previous version

* added namespace to comply with R 2.13

Version 0.4.2
=============
Changes to previous version

* corrected mismatch in default values for mu

Version 0.4.1
=============
Changes to previous version

* changed default value of the mu parameter from -1000 to NA

Version 0.4
===========
Changes to previous version

* removed regression for Normal distributed errors
* added cusum for change in mean
* added cusum of squares for change in variance
Expand All @@ -244,26 +207,18 @@ Changes to previous version

Version 0.3
===========
Changes to previous version

* added change in regression for Normal distributed errors

Version 0.2.1
=============
Changes to previous version

* fixed multiple optimal changepoint bug in multiple algorithms

Version 0.2
===========
Changes to previous version

* added change in scale parameter of Gamma distribution to cpt.meanvar
* added change in Exponential distribution to cpt.meanvar
* added ncpts function to cpt and cpt.reg classes

Version 0.1: Original

Rebecca Killick
Lancaster University
www.lancs.ac.uk/~killick
Version 0.1
===========
* Original
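
As a hedged sketch of the cpt.range behaviour described in the Version 2.2.3 entries above (cpts(object,ncpts=X) and the "type=" pass-through for diagnostic plots); the CROPS penalty range and data below are illustrative assumptions, not part of the commit:

    library(changepoint)
    set.seed(1)
    x <- c(rnorm(100, 0), rnorm(100, 5), rnorm(100, 0))           # two illustrative mean changes
    fit <- cpt.mean(x, penalty = "CROPS", pen.value = c(5, 500),  # returns a cpt.range object
                    method = "PELT")
    cpts.full(fit)                   # all segmentations found across the penalty range
    cpts(fit, ncpts = 2)             # the 2-changepoint segmentation, if one was recorded
    plot(fit, diagnostic = TRUE, type = "b")   # elbow plot; "type" is now passed through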
47 changes: 39 additions & 8 deletions R/cpt.class.R
@@ -142,15 +142,27 @@
if (is.function("cpts")){
fun <- cpts
}
else {fun <- function(object){
else {fun <- function(object,...){
standardGeneric("cpts")
}
}
setGeneric("cpts", fun)
}
setMethod("cpts","cpt",function(object) object@cpts[-length(object@cpts)])
setMethod("cpts","cpt.reg",function(object) object@cpts[-length(object@cpts)])
setMethod("cpts","cpt.range",function(object,ncpts=NA){
if(is.na(ncpts)){return(object@cpts[-length(object@cpts)])}
else{
ncpts.full=apply(cpts.full(object),1,function(x){sum(x>0,na.rm=TRUE)})
row=try(which(ncpts.full==ncpts),silent=TRUE)
if(inherits(row,'try-error')){
stop("Your input object doesn't have a segmentation with the requested number of changepoints.\n Possible ncpts are: ",paste(ncpts.full,collapse=','))
}
else{return(cpts.full(object)[row,])}
}
})


if(!isGeneric("cpts.full")) {
if (is.function("cpts.full")){
fun <- cpts.full
@@ -346,6 +358,16 @@
else{ object@cpts <- c(value,n) }
return(object)
})
setReplaceMethod("cpts", "cpt.range", function(object, value) {
if((cpttype(object)=="meanar")|(cpttype(object)=="trendar")){
n=length(object@data.set)-1
}
else{n=length(object@data.set)}

if(value[length(value)]==n){object@cpts <- value}
else{ object@cpts <- c(value,n) }
return(object)
})
setReplaceMethod("cpts", "cpt.reg", function(object, value) {
if(value[length(value)]==nrow(object@data.set)){object@cpts <- value}
else{ object@cpts <- c(value,nrow(object@data.set)) }
@@ -547,7 +569,7 @@
param.var=function(object,cpts){
nseg=length(cpts)-1
data=data.set(object)
seglen=seg.len(object)
seglen=cpts[-1]-cpts[-length(cpts)]
tmpvar=NULL
for(j in 1:nseg){
tmpvar[j]=var(data[(cpts[j]+1):(cpts[j+1])])
@@ -578,8 +600,8 @@
thetaT=(6*cptsumstat[,2])/((seglen+1)*(2*seglen+1)) + (thetaS * (1-((3*seglen)/((2*seglen)+1))))
return(cbind(thetaS,thetaT))
}
param.meanar=function(object){
seglen=seg.len(object)
param.meanar=function(object,cpts){
seglen=cpts[-1]-cpts[-length(cpts)]
data=data.set(object)
n=length(data)-1
sumstat=cbind(cumsum(c(0,data[-1])),cumsum(c(0,data[-(n+1)])),cumsum(c(0,data[-1]*data[-(n+1)])),cumsum(c(0,data[-1]^2)),cumsum(c(0,data[-(n+1)]^2)))
@@ -589,8 +611,8 @@

return(cbind(beta1,beta2))
}
param.trendar=function(object){
seglen=seg.len(object)
param.trendar=function(object,cpts){
seglen=cpts[-1]-cpts[-length(cpts)]
data=data.set(object)
n=length(data)-1
sumstat=cbind(cumsum(c(0,data[-1])),cumsum(c(0,data[-(n+1)])),cumsum(c(0,data[-1]*data[-(n+1)])),cumsum(c(0,data[-1]*c(1:n))),cumsum(c(0,data[-(n+1)]*c(0:(n-1)))))
@@ -685,7 +707,7 @@
}
tmpfit=eval(parse(text=paste('lm(data[',(cpts[j]+1),':',cpts[j+1],',1]~',formula,')',sep='')))
tmpbeta[j,]=tmpfit$coefficients
tmpsigma[j]=var(tmpfit$residuals)
tmpsigma[j]=sum(tmpfit$residuals^2)/(length(tmpfit$residuals)-length(tmpfit$coefficients)) ##var(tmpfit$residuals)
}
return(list(beta=tmpbeta,sig2=tmpsigma))
}
@@ -805,7 +827,15 @@

setMethod("plot","cpt.range",function(x,ncpts=NA,diagnostic=FALSE,cpt.col='red',cpt.width=1,cpt.style=1,...){
if(diagnostic==TRUE){
return(plot(apply(cpts.full(x),1,function(x){sum(x>0,na.rm=TRUE)}),pen.value.full(x),type='l',xlab='Number of Changepoints',ylab='Penalty Value',...))
n.changepoints = apply(cpts.full(x), 1, function(x) sum(x > 0, na.rm = TRUE))
penalty.values = pen.value.full(x)
if (is.null(list(...)$type)) {
# By default, the type of the diagnostic plots is "lines".
plot(x = n.changepoints, y = penalty.values, xlab = 'Number of Changepoints', ylab = 'Penalty Value', type = "l", ...)
} else {
plot(x = n.changepoints, y = penalty.values, xlab = 'Number of Changepoints', ylab = 'Penalty Value', ...)
}
return(invisible(NULL))
}
plot(data.set.ts(x),...)
if(is.na(ncpts)){
@@ -891,6 +921,7 @@
rss=sum((data.set(object)-means)^2)
n=length(data.set(object))
like=n*(log(2*pi)+log(rss/n)+1) # -2*loglik
cpts=c(0,object@cpts)
if(pen.type(object)=="MBIC"){
like=c(like, like+(nseg(object)-2)*pen.value(object)+sum(log(cpts[-1]-cpts[-(nseg(object)+1)])))
}else{
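
A small worked sketch of the segment-length computation that the hunks above substitute for seg.len(); the changepoint vector is made up for illustration:

    cpts <- c(0, 10, 25, 40)                   # 0 and n bracket the segments, as in the helpers above
    seglen <- cpts[-1] - cpts[-length(cpts)]   # 10 15 15 - one length per segment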
10 changes: 5 additions & 5 deletions R/exp.R
@@ -20,7 +20,7 @@ single.meanvar.exp.calc <-
}
}

if(is.null(dim(data))==TRUE){
if(is.null(dim(data))==TRUE || length(dim(data)) == 1){
# single data set
cpt=singledim(data,extrainf,minseglen)
return(cpt)
@@ -48,7 +48,7 @@

single.meanvar.exp<-function(data,penalty="MBIC",pen.value=0,class=TRUE,param.estimates=TRUE,minseglen){
if(sum(data<0)>0){stop('Exponential test statistic requires positive data')}
if(is.null(dim(data))==TRUE){
if(is.null(dim(data))==TRUE || length(dim(data)) == 1){
# single dataset
n=length(data)
}
@@ -59,7 +59,7 @@ single.meanvar.exp<-function(data,penalty="MBIC",pen.value=0,class=TRUE,param.es
if(n<(2*minseglen)){stop('Minimum segment legnth is too large to include a change in this data')}

penalty_decision(penalty, pen.value, n, diffparam=1, asymcheck="meanvar.exp", method="AMOC")
if(is.null(dim(data))==TRUE){
if(is.null(dim(data))==TRUE || length(dim(data)) == 1){
tmp=single.meanvar.exp.calc(coredata(data),extrainf=TRUE,minseglen)
if(penalty=="MBIC"){
tmp[3]=tmp[3]+log(tmp[1])+log(n-tmp[1]+1)
@@ -267,7 +267,7 @@ multiple.meanvar.exp=function(data,mul.method="PELT",penalty="MBIC",pen.value=0,
}

diffparam=1
if(is.null(dim(data))==TRUE){
if(is.null(dim(data))==TRUE || length(dim(data)) == 1){
# single dataset
n=length(data)
}
@@ -278,7 +278,7 @@ multiple.meanvar.exp=function(data,mul.method="PELT",penalty="MBIC",pen.value=0,

pen.value = penalty_decision(penalty, pen.value, n, diffparam=1, asymcheck=costfunc, method=mul.method)

if(is.null(dim(data))==TRUE){
if(is.null(dim(data))==TRUE || length(dim(data)) == 1){
# single dataset
out = data_input(data=data,method=mul.method,pen.value=pen.value,costfunc=costfunc,minseglen=minseglen,Q=Q)

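
A short illustration of the 1D-array case that the added length(dim(data)) == 1 checks in this file are meant to catch (see the 2.2.4 NEWS entry); the example data are illustrative:

    x <- rnorm(50)                    # plain vector: dim(x) is NULL, so the old is.null(dim()) check treats it as one series
    a <- array(rnorm(50), dim = 50)   # 1D array: dim(a) is 50
    is.null(dim(a))                   # FALSE - slipped past the old check
    length(dim(a)) == 1               # TRUE  - the new condition treats it as a single series too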