Updater script, asset files and also spark.binproto file added. #448
base: master
Conversation
Hey @VickyTheViking, thanks for your contribution! I'm reviewing your plugin, but I've found that it's not working properly. Specifically, it looks like the fingerprinting for the main […] Here are the issues to fix: […]
Feel free to reach out. ~ Savio (Doyensec)
Hi @lokiuox, thank you for the review. I fixed some of the items you pointed out. For the permission error you mentioned, I searched for the best way to fix it, and the best approach is what you did before setting […]. The Spark image does not include Python, but it does ship Java tooling that can launch Python scripts. For example, I can run the Fibonacci example with this command: `docker exec -d spark-master /opt/spark/bin/spark-submit --master spark://spark-master:7077 /opt/spark/examples/src/main/python/fib.py`. In this example we run fib.py through spark-submit.
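A job submitted this way is just an ordinary PySpark script. For illustration, a minimal sketch of what such a Fibonacci example could look like (this is an assumption about fib.py's contents, not the actual file shipped in the image):

```python
# Minimal PySpark job sketch, an illustrative stand-in for the bundled
# fib.py example; it can be launched with the spark-submit command above.
from pyspark.sql import SparkSession


def fib(n):
    # Plain iterative Fibonacci, executed on the worker processes.
    a, b = 0, 1
    for _ in range(n):
        a, b = b, a + b
    return a


if __name__ == "__main__":
    spark = SparkSession.builder.appName("FibExample").getOrCreate()
    # Distribute the first ten indices and compute their Fibonacci numbers in parallel.
    print(spark.sparkContext.parallelize(range(10)).map(fib).collect())
    spark.stop()
```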
Hey @VickyTheViking, thanks for the update! You still have to address the following issues: […]
This is what I get when I try to manually reproduce the workflow and launch the docker exec command: […]
Hi @VickyTheViking, are you still interested in contributing to Tsunami with this plugin?
@lokiuox Hi, sorry for the late reply. I think I can finish this plugin soon, so yes, I am still interested in contributing to Tsunami. Please give me some time to do this. Thanks.
…f docker network, use python3-included containers; otherwise install python3, python3-pip, and the pyspark Python package
@lokiuox Hi
done
done. I found a bug that made me change the hostname of the spark-master to 127.0.0.1. With this update, I haven't seen any error messages so far.
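For anyone reproducing this, a quick sanity check after the hostname change could look like the sketch below (assumptions: the master UI is published on 127.0.0.1 and uses Spark's default port 8080; adjust if the compose file maps it differently):

```python
# Sanity check: confirm the Spark master web UI now answers on 127.0.0.1.
# Port 8080 is Spark's default for the master UI; the mapping is an assumption.
import urllib.request

with urllib.request.urlopen("http://127.0.0.1:8080/", timeout=5) as resp:
    body = resp.read().decode("utf-8", errors="replace")
    # The master UI page embeds the Spark version string, which is what
    # version fingerprinting keys on.
    print(resp.status, "Spark Master" in body)
```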
Hi, dear Tsunami team.
Apache Spark serves a different web UI depending on how it is run. After hours of searching, I found a way to run it so that all of the web UIs are accessible. So in this pull request we have:
1- Master web UI
2- Worker web UI
3- Web interface (Runs only when a SparkContext is running)
All of these are extracted in one run for each version. I used apache/spark as the base docker image because it covers more versions than the official _/spark docker repo (the images otherwise do not differ from each other); versions without a docker image were ignored.
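By default these three UIs sit on separate ports: 8080 for the master, 8081 for the first worker, and 4040 for the application UI while a SparkContext is alive. A rough sketch of collecting data from all three in one run, assuming those default ports, a stack published on 127.0.0.1, and illustrative request paths (not necessarily what the actual updater script does):

```python
# Illustrative sketch: probe the three Spark web UIs in one run and hash each
# landing page, the kind of static data a fingerprint updater might record.
# Assumed default ports: 8080 (master), 8081 (worker), 4040 (application UI,
# only reachable while a SparkContext is running).
import hashlib
import urllib.request

UIS = {
    "master": "http://127.0.0.1:8080/",
    "worker": "http://127.0.0.1:8081/",
    "application": "http://127.0.0.1:4040/",
}


def fetch_hash(url):
    # Return the SHA-256 of the response body, or None if the UI is unreachable.
    try:
        with urllib.request.urlopen(url, timeout=5) as resp:
            return hashlib.sha256(resp.read()).hexdigest()
    except OSError:
        return None


for name, url in UIS.items():
    print(name, fetch_hash(url))
```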