AWS Lambda, Python, NumPy and Others as Layers
Solution 1:
This is probably not the answer you want to hear, but honestly the pain of getting certain compiled libraries into Lambda layers was enough for my company to just stop using them. Instead we tend to use either Fargate or ECS with Docker containers.
Besides the issues of compiling packages for Lambda, we also ran into major problems with the maximum deployment size (250 MB unzipped, including layers). We regularly hit that cap and had to resort to increasingly hacky workarounds, stripping files out just to make deployments fit.
Update: AWS now lets you run Lambdas from container images stored in ECR (up to 10 GB), which solves this problem nicely.
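In case it helps, here is a rough sketch of that container workflow, assuming a handler in function.py; the base image is the AWS-provided Python 3.8 Lambda image, and the account ID, region, repository name, and IAM role are all placeholders:

# Minimal Dockerfile on top of the AWS Lambda Python base image
cat > Dockerfile <<'EOF'
FROM public.ecr.aws/lambda/python:3.8
COPY requirements.txt function.py ${LAMBDA_TASK_ROOT}/
RUN pip install -r requirements.txt
CMD ["function.handler"]
EOF

# Build the image and push it to ECR
aws ecr create-repository --repository-name my-lambda
aws ecr get-login-password --region us-east-1 | docker login --username AWS --password-stdin 123456789012.dkr.ecr.us-east-1.amazonaws.com
docker build -t my-lambda .
docker tag my-lambda:latest 123456789012.dkr.ecr.us-east-1.amazonaws.com/my-lambda:latest
docker push 123456789012.dkr.ecr.us-east-1.amazonaws.com/my-lambda:latest

# Create the function from the image
aws lambda create-function --function-name my-lambda \
  --package-type Image \
  --code ImageUri=123456789012.dkr.ecr.us-east-1.amazonaws.com/my-lambda:latest \
  --role arn:aws:iam::123456789012:role/my-lambda-execution-role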
Solution 2:
You should not need to recompile the layer every time you deploy. We have a Lambda layer specifically for ML libraries like numpy, pandas, and fbprophet. It works great because our Lambda deployment zip files stay tiny, which speeds up development and deployment.
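For reference, here is a minimal sketch of how a layer like that can be built and published; the layer name, runtime, and requirements.txt are examples rather than our exact setup:

# Lambda layers expect packages under a top-level python/ directory
mkdir -p layer/python
python3.8 -m pip install -r requirements.txt -t layer/python
(cd layer && zip -r ../ml-layer.zip python)

# Publish the layer; for zips too large to upload directly, put the file
# in S3 and use --content S3Bucket=...,S3Key=... instead of --zip-file
aws lambda publish-layer-version \
  --layer-name ml-libs \
  --zip-file fileb://ml-layer.zip \
  --compatible-runtimes python3.8

The function itself then only ships your own code, with the layer ARN attached in its configuration.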
I'm happy to help further. Can you give more information about what you tried and what was going wrong?
Solution 3:
In case anyone else stumbles across this post, there are now pre-built layers published in AWS that you can use directly:
https://github.com/keithrozario/Klayers
For those of you who want to make your own (like I did), one comment in the Docker packaging script of that repo solved my problem: python (build with python3.6 or python3.7, not python3)
That comment clicked for me: build with the same version of Python that your Lambda runtime uses. In my original Makefile for the Lambda layer, I was using python3 to build the packages, and that always gave me the same error when I tried running the Lambda:
Runtime.ImportModuleError Unable to import module 'function' .... Original error was: No module named 'numpy.core._multiarray_umath'
However, when I switched to this:
python3.8 -m pip install -r requirements.txt -t "$(ARTIFACTS_DIR)/python"
which matches my AWS Lambda runtime, I was able to run numpy, pandas, and openpyxl without issue.
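If the matching interpreter isn't installed locally, a variation that should work (my assumption, not exactly what the Klayers script does) is to run the same pip install inside an Amazon Linux build image for that runtime, for example the SAM build image:

# Install dependencies inside a container matching the Lambda runtime,
# then zip them in the layout layers expect (python/ at the top level)
docker run --rm -v "$PWD":/work -w /work public.ecr.aws/sam/build-python3.8 \
  python3.8 -m pip install -r requirements.txt -t python
zip -r layer.zip python

Either way, the key is that the interpreter, and therefore the compiled wheels pip selects, matches what Lambda actually runs; a mismatch is what produces the numpy.core._multiarray_umath import error above.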