
Scala apps deploy fail #18

Open
UlTriX opened this issue Mar 5, 2020 · 8 comments

@UlTriX

UlTriX commented Mar 5, 2020

I am trying to deploy a Scala app using a buildpack (builder v2.13.3 and slugbuilder v2.7.3), but it fails when the buildpack tries to write to the base dir (access denied).

I tried both the provided buildpacks and a custom one pointing to the latest version; same issue.

The line of the buildpack that fails:

cat << EOF > "${BASE_DIR}/export"

It seems like an issue with folder permissions and the user that runs the buildpack.

Any ideas about what is happening, and possible solutions?

Thanks in advance.

@Cryptophobia Cryptophobia self-assigned this Mar 5, 2020
@Cryptophobia
Member

Can you provide a complete log of the slugbuilder pod after it terminates?

I suspect it has to do with permissions applied to the slugbuilder user or a security profile applied to the pod. What version of Kubernetes are you running Hephy on?

@UlTriX
Author

UlTriX commented Mar 5, 2020

These are the last lines of the slugbuilder pod's log before it terminates:

[success] Total time: 57 s, completed Mar 5, 2020 4:06:02 PM
[info] Wrote /tmp/scala_buildpack_build_dir/target/scala-2.11/play-getting-started_2.11-1.0-SNAPSHOT.pom
[info] Packaging /tmp/scala_buildpack_build_dir/target/scala-2.11/play-getting-started_2.11-1.0-SNAPSHOT.jar ...
[info] Done packaging.
[info] Packaging /tmp/scala_buildpack_build_dir/target/scala-2.11/play-getting-started_2.11-1.0-SNAPSHOT-web-assets.jar ...
[info] Done packaging.
[info] Packaging /tmp/scala_buildpack_build_dir/target/scala-2.11/play-getting-started_2.11-1.0-SNAPSHOT-sans-externalized.jar ...
[info] Done packaging.
[success] Total time: 1 s, completed Mar 5, 2020 4:06:02 PM
-----> Dropping ivy cache from the slug
-----> Dropping sbt boot dir from the slug
-----> Dropping compilation artifacts from the slug
/tmp/buildpacks/12-scala/bin/compile: line 214: //export: Permission denied

The same error is returned on two Kubernetes clusters I control, versions 1.15.10 and 1.15.5.
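A quick illustration (my reading of the log, not stated in the thread): the failing path in the error is `//export`, which is what `"${BASE_DIR}/export"` expands to when `BASE_DIR` is `/` — the container's root directory, which a non-root build user cannot write to:

```shell
# BASE_DIR here stands in for the variable used in bin/compile;
# the value "/" is an assumption inferred from the "//export" path.
BASE_DIR=/
echo "${BASE_DIR}/export"   # prints //export
```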

@Cryptophobia
Member

Is there already a file/folder called export in the repo you are building?

@Cryptophobia
Member

This seems to be a step in the buildpack that is very specific to Heroku. You can also just clone the Scala buildpack repo and remove those last few lines, then set BUILDPACK_URL to your cloned version.
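The cloning workaround can be sketched as follows. This is a hypothetical illustration, not from the thread: the stand-in `bin/compile` content and the sed range are assumptions, and in practice you would inspect the real `heroku-buildpack-scala/bin/compile`, remove the block that writes `"${BASE_DIR}/export"` (line 214 in the failing version), and push the fork somewhere reachable.

```shell
# Simulate the edit on a stand-in bin/compile (hypothetical content).
mkdir -p demo/bin
cat > demo/bin/compile <<'SCRIPT'
echo "-----> Dropping compilation artifacts from the slug"
cat << EOF > "${BASE_DIR}/export"
export SBT_HOME=...
EOF
SCRIPT

# Delete the heredoc that writes "${BASE_DIR}/export" -- the step that
# fails under Hephy. Run this against the real bin/compile in your clone.
sed -i '/> "${BASE_DIR}\/export"/,/^EOF$/d' demo/bin/compile

# Then point the app at your fork (placeholder URL):
# deis config:set BUILDPACK_URL=https://github.com/<your-user>/heroku-buildpack-scala
```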

@Cryptophobia
Member

They added this export functionality to the Scala buildpack here, and I am not sure why: heroku/heroku-buildpack-scala#135

@UlTriX
Author

UlTriX commented Mar 5, 2020

> Is there already a file/folder called export in the repo you are building?

No, there is none.

> This seems to be a step in the buildpack that is very specific to Heroku. You can also just clone the Scala buildpack repo and remove those last few lines. Then set BUILDPACK_URL to your cloned version.

Thank you for the idea; it seems like a good workaround. I will create my custom buildpack for now.

> They added this export functionality to the Scala buildpack here, and I am not sure why: heroku/heroku-buildpack-scala#135

Indeed. It seems to me it is about running different buildpacks over the same app, perhaps?

This means building Scala apps on the latest Workflow is broken, since the default buildpack points to a version after that change. Perhaps custom buildpacks are needed for Workflow? Is there any way of running the build as root?

Thank you for all the help.

@Cryptophobia
Member

Cryptophobia commented Mar 5, 2020

Another way to try to solve this would be to create a PSP (pod security policy) for the deis namespace:

kubectl get psp --all-namespaces
NAME              PRIV   CAPS   SELINUX    RUNASUSER   FSGROUP    SUPGROUP   READONLYROOTFS   VOLUMES
deis.privileged   true   *      RunAsAny   RunAsAny    RunAsAny   RunAsAny   false            *

https://docs.bitnami.com/kubernetes/how-to/secure-kubernetes-cluster-psp/

Not sure if it will solve it, though. This would mean that minikube, or wherever you are running k8s, is restricting the pods in some way.

@UlTriX, are you running minikube with a VM driver set?
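If a security policy does turn out to be the cause, a permissive PSP matching the `deis.privileged` row above might look like the following. This is a hedged sketch: the field values are inferred from the `kubectl get psp` columns, using the `policy/v1beta1` PSP API current in Kubernetes 1.15. (Note that PSPs themselves are cluster-scoped; they are granted to a namespace's service accounts via RBAC, which the linked Bitnami guide covers.)

```shell
# Write the PSP manifest to a file; apply it with:
#   kubectl apply -f /tmp/deis-privileged-psp.yaml
cat > /tmp/deis-privileged-psp.yaml <<'EOF'
apiVersion: policy/v1beta1
kind: PodSecurityPolicy
metadata:
  name: deis.privileged
spec:
  privileged: true                 # PRIV column
  allowedCapabilities: ['*']       # CAPS column
  seLinux:
    rule: RunAsAny
  runAsUser:
    rule: RunAsAny
  fsGroup:
    rule: RunAsAny
  supplementalGroups:
    rule: RunAsAny
  readOnlyRootFilesystem: false    # READONLYROOTFS column
  volumes: ['*']                   # VOLUMES column
EOF
```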

@UlTriX
Author

UlTriX commented Mar 6, 2020

I am going to try messing around with the security policies.

I am not running minikube. I tested on two k8s clusters installed on bare metal (one a fresh install) and also on local Docker Desktop (with k8s enabled).
