Replies: 2 comments 7 replies
-
Yes, min-max normalization for inputs to
-
Thanks for the link. I'd implemented these transforms myself because I didn't know they were already available in BoTorch. Is there a reason why only standardization is recommended, though? I've only seen standardization used in the GPyTorch docs and in GitHub issues here. Are the default hyperparameter values better suited to standardized data, or is it something else that makes standardization better for GPs (for both inputs and outputs)?
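For concreteness, here is a minimal NumPy sketch (not code from the thread) of the two scalings being compared; BoTorch's `Normalize` and `Standardize` transforms handle this (and the inverse) for you, but the arithmetic is just this:

```python
import numpy as np

def minmax_normalize(X, lo=None, hi=None):
    # Map each input column to [0, 1] using (training) bounds.
    lo = X.min(axis=0) if lo is None else lo
    hi = X.max(axis=0) if hi is None else hi
    return (X - lo) / (hi - lo), lo, hi

def standardize(Y, mu=None, sigma=None):
    # Zero mean, unit variance per output column, using training statistics.
    mu = Y.mean(axis=0) if mu is None else mu
    sigma = Y.std(axis=0) if sigma is None else sigma
    return (Y - mu) / sigma, mu, sigma

rng = np.random.default_rng(0)
X = rng.uniform(-5.0, 5.0, size=(20, 3))   # inputs with arbitrary ranges/units
Y = rng.normal(100.0, 10.0, size=(20, 1))  # outputs far from zero mean

Xn, lo, hi = minmax_normalize(X)           # inputs in the unit cube
Ys, mu, sigma = standardize(Y)             # outputs with zero mean, unit variance
Y_back = Ys * sigma + mu                   # inverse transform recovers original units
```

At test time you would reuse the stored `lo`/`hi` and `mu`/`sigma` from training rather than recomputing them on test data, which is also how the library transforms behave.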
-
Hello,
I am curious why standardization is recommended for both inputs and outputs in GP regression over other scaling methods such as min-max normalization. If the training outputs are standardized, they have zero mean, so a zero-mean GP can be used, after which test predictions can be inverse-transformed (using the mean and standard deviation of the training data) to recover the "true" predicted values. For inputs, it makes sense to scale the data in some way to remove the units associated with each feature, but both standardization and min-max normalization make the data unitless.
Could you use other methods such as min-max normalization to scale the data instead? For example, could you min-max normalize the inputs and standardize the outputs? Would using a scaling method other than standardization significantly impact model performance? I have not found many discussions on scaling data for GPs specifically; the best one I've found is this: https://stats.stackexchange.com/q/178245/317677, but it only covers standardization.
Any discussion on data scaling for GPs would be greatly appreciated, thanks!
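To make the "standardize outputs, use a zero-mean GP, then inverse-transform" workflow described above concrete, here is a minimal sketch with a hand-rolled RBF-kernel GP in NumPy; the data, kernel hyperparameters, and noise level are all illustrative assumptions, not anything from the thread:

```python
import numpy as np

def rbf_kernel(a, b, lengthscale=1.0, variance=1.0):
    # Squared-exponential kernel: k(x, x') = s^2 * exp(-||x - x'||^2 / (2 l^2))
    sq_dists = ((a[:, None, :] - b[None, :, :]) ** 2).sum(axis=-1)
    return variance * np.exp(-0.5 * sq_dists / lengthscale**2)

rng = np.random.default_rng(0)
X_train = rng.uniform(0.0, 10.0, size=(30, 1))
y_train = 50.0 + 5.0 * np.sin(X_train[:, 0]) + 0.1 * rng.normal(size=30)

# Standardize outputs using training statistics only: zero mean, unit variance.
y_mean, y_std = y_train.mean(), y_train.std()
y_scaled = (y_train - y_mean) / y_std

# Zero-mean GP posterior mean at test points (fixed noise variance 1e-2).
X_test = np.linspace(0.0, 10.0, 5)[:, None]
K = rbf_kernel(X_train, X_train) + 1e-2 * np.eye(len(X_train))
K_star = rbf_kernel(X_test, X_train)
mu_scaled = K_star @ np.linalg.solve(K, y_scaled)

# Inverse transform to get predictions back in the original units.
mu = mu_scaled * y_std + y_mean
```

The same pattern would apply with min-max normalized inputs: only the input scaling step changes, while the output standardization and its inverse are unaffected.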