Hello experienced programmers, I'm asking for advice about my workflow.
I write a lot of small programs that run on a server to perform various operations and calculations.
My workflow is simply to log into the server, open vim and write the code; then I store it directly in a stow directory if it's Python, or its compiled binary if it's something else.
The more I do this, the more I realize it's unsustainable:
- I'm not using git, which is problematic as I often have to come back and make some adjustments.
- I cannot use a different IDE, which I would prefer.
- I don't really have proper personal backups.
The advantage of this is that I just have to write the code and I can use it immediately.
What I would like is to write the code on my personal computer and have the software seamlessly available in ~/.local/bin on my server, ready to run.
I do not want to have to run rsync 20 times in order to do this.
Do you know how I could set up my system in order to achieve this?
Thank you.
Elena ``of Valhalla''
in reply to rastinza
@rastinza I would put ~/.local/bin under git, push it somewhere on the local network, then edit and commit it on the PC and pull it from the servers when I need it.
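Roughly something like this, just as a sketch (nas.local and my-script.py are placeholders, not things you actually have):

    # one-time: create a bare repository on some LAN host (nas.local is a placeholder)
    ssh nas.local 'git init --bare repos/bin.git'

    # on the PC: turn ~/.local/bin into a working copy and push it
    cd ~/.local/bin
    git init
    git remote add origin nas.local:repos/bin.git
    git add my-script.py          # whatever scripts you wrote yourself
    git commit -m 'add my scripts'
    git push -u origin HEAD       # pushes whatever your default branch is called

    # on each server: clone once (into an empty ~/.local/bin), then pull after every edit
    git clone nas.local:repos/bin.git ~/.local/bin
    cd ~/.local/bin && git pull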
However, that looks pretty close to rsync 20 times, if you have 20 servers, so maybe it's not what you're looking for?
rastinza
in reply to Elena ``of Valhalla''
- I have a bunch of stuff in my ~/.local/bin, much of it binaries, and I often add or remove things from there. Most of that stuff is not developed by me, so I don't want it in my git repository; moreover, this would pull in whichever libraries and local environments I'm using in my bin directory.
And then, of course, I would have to keep committing and pulling every time I edit something.