I just joined a new team (very small: four developers in total, with two of those leaving soon). The two original developers set up the git repo on a folder in a Windows network share.
Am I taking crazy pills, or is that a bad idea? Our organization does have github/gitlab/bitbucket available, so is there any good reason not to use those hosted solutions?
Do you mean a bare repo that you use as a remote and push/pull from, or everyone using the working directory straight from the share? The first would be strange but kinda usable (don’t do it, though); the latter is “oh my god, get another job” territory.
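For what it’s worth, the distinction is easy to see on the command line. A minimal sketch, using local temp directories as stand-ins for the share (the paths are hypothetical):

```shell
set -e
# A bare repo: no working tree, just git's data. This is the "use as a
# remote" case -- you only push to and pull from it.
bare=$(mktemp -d)/project.git      # stand-in for \\server\share\project.git
git init --bare -q "$bare"
git -C "$bare" rev-parse --is-bare-repository   # prints "true"

# A normal repo with a working tree. Editing THIS directly over a share
# is the nightmare case.
work=$(mktemp -d)/project
git init -q "$work"
git -C "$work" rev-parse --is-bare-repository   # prints "false"
```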
Working from the network share - I’ve worked on a project like this before, it was awful for developer experience. It took seconds to run a
git status
, it was so slow. Occasionally git would lock itself out of being able to update files, since the network read/write times were so long. Large merges were impossible. The reason it was set up like this was that the CEO had the network share we were working off of set up to serve to a subdomain (so, like, Bob’s work would be at bob.example.com), and it would update live, as most hot reloads do. He wanted to be able to spy on developers and see what they were doing at any given time.
I have a lot of programming horror stories from that job. The title of this thread brought up a lot of repressed memories.
Yes, it’s definitely the former case, thankfully. Agreed that it’s strange, but it’s hard to put a technical reason behind it if I decide to push for hosting it somewhere better.
Why not just host a server in house?
Still a better version control than
20210131_v0.2_test2_john
20210131_v0.2_test2_john_final_final2_final_real
Are you concerned about corruption due to multiple users? Are you using the repo in the intended way? Then it’s fine. Git has locking mechanisms. Pull, work, commit, push.
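The everyday cycle against a shared bare repo can be sketched like this, with local temp dirs standing in for the share and two workstations (the names and paths are made up):

```shell
set -e
hub=$(mktemp -d)/hub.git            # stand-in for the bare repo on the share
git init --bare -q "$hub"

# First dev: pull (implicit in the fresh clone), work, commit, push.
alice=$(mktemp -d)
git clone -q "$hub" "$alice" 2>/dev/null
git -C "$alice" -c user.name=Alice -c user.email=alice@example.test \
    commit -q --allow-empty -m "alice's work"
git -C "$alice" push -q origin HEAD

# Second dev clones (or pulls) and sees that work.
bob=$(mktemp -d)
git clone -q "$hub" "$bob" 2>/dev/null
git -C "$bob" log -1 --format=%s    # prints "alice's work"
```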
I can’t exactly put my finger on it, but something feels off. For example, on my first day, I wasn’t able to view the files in Windows Explorer (or clone the repo, actually), so the other dev just gave me a zip file of the repo. There’s something fishy going on, and I’m trying to figure it out.
Since it’s on a network share, there’s the extra overhead of managing the file system permissions. And you probably hadn’t received access at that point.
That probably is the case, but in my mind I’m also questioning if they’re backing it up regularly, what prevents someone from going in and deleting the files, etc.
Sure, let’s hope they have a backup policy in place as a best practice. But it’s also kinda decentralized anyway: every dev is going to have their local repo, and that is essentially a backup.
Our organization does have github/gitlab/bitbucket available
Do you mean “cloud services”? Maybe your colleagues don’t want them there.
For PCI-DSS relevant code, we only use internal systems.
I don’t see how this would be compliant with literally anything.
It’s actually fine, as long as you coordinate with them.
They offer services that cater to just about any compliance need, including things as annoying as FedRAMP. I would have to agree on this: it seems rather odd for a confidential or classified code repo to be shared on a Windows share. The reason we would use (self-hosted) Git services is so that we have a multitude of security services/layers maintained by a dedicated team of system administrators: firewall, service updates, data redundancy, backups, Active Directory, and so forth.
I can see a scenario where people accidentally put classified repos, or information that isn’t supposed to be shared, on a Windows share where unauthorized users could view it.
That may be the case, but the original engineers have made other highly questionable decisions: the backend service was written in Java 8…just last year!
That doesn’t sound questionable, more like somewhere between stubborn and stupid. Unless that thing is supposed to be deployed to a heavily outdated system where nothing newer than Java 8 will run, that is.
Do they have an agreement with GitHub/gitlab/bitbucket? Using their consumer targeted services as a business is just asking for trouble.
Storing a bare repository on a shared filesystem is a perfectly valid solution, and something git was designed to do, if you don’t mind the lack of a web interface. Although I’d personally prefer using an ssh connection.
This is actually a very large government agency, with many internal as well as external projects hosted on those services, in the public instances as well as our own internal hosted instances of those services. But as long as there’s no glaring issues with it, and it’s a generally acceptable practice, then I’m fine with it as it doesn’t really affect my day to day use via command line.
You can run a private instance of GitLab without issues. Also, there are more open-source and free git repository services beyond GitHub and GitLab that you can install and run locally on a small server.
That’s definitely possible, but that doesn’t change the fact that you can make do without them.
It is fine. But for your personal growth and some more peace of mind, you should migrate the repository to one of the git services of your organization.
Personal growth because you will be able to use features like branch policies, CI/CD pipelines, or an integrated work-item board.
Peace of mind because that network share is less likely to be recoverable than, for example, a self-hosted GitLab instance.
The migration process also takes less than five minutes, so there are more advantages than disadvantages to doing so.
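For reference, the migration is roughly a mirror clone plus a mirror push. This sketch uses two local bare repos as stand-ins for the share and the hosted service; all paths and URLs here are placeholders:

```shell
set -e
old=$(mktemp -d)/share.git          # stand-in for the repo on the share
new=$(mktemp -d)/hosted.git         # stand-in for the new hosted remote
git init --bare -q "$old"
git init --bare -q "$new"

# Seed the "old" repo with one commit so there is history to migrate.
w=$(mktemp -d)
git clone -q "$old" "$w/work" 2>/dev/null
git -C "$w/work" -c user.name=Dev -c user.email=dev@example.test \
    commit -q --allow-empty -m "history"
git -C "$w/work" push -q origin HEAD

# The actual migration: copy every branch and tag to the new remote.
git clone -q --mirror "$old" "$w/mirror"
git -C "$w/mirror" push -q --mirror "$new"
# Afterwards, each dev runs: git remote set-url origin <new-hosted-url>
```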
The big difference there seems to be cloud (i.e., someone else’s computer) versus the company-internal network. That is a big and fundamental choice: will you allow (leak) your data outside of your control?
Doing anything with Windows shares is a bad idea technically, of course. But with git, every workspace is generally a full copy of the repository. All you need a shared one for is syncing work. And backups are always a sensible thing to have no matter how you arrange files. With that, it seems to be a low risk thing to me.
A bundle might or might not be safer than a repo, but either will probably work.
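A quick sketch of the bundle approach, assuming nothing about the real repo (temp dirs only): a bundle packs a repo’s history into a single file, so copying it across a flaky share avoids touching a live repo’s internals.

```shell
set -e
src=$(mktemp -d)
git init -q "$src"
git -C "$src" -c user.name=Dev -c user.email=dev@example.test \
    commit -q --allow-empty -m "work"

# Pack all refs into one file -- the only thing that crosses the share.
bundle=$(mktemp -d)/project.bundle
git -C "$src" bundle create "$bundle" --all 2>/dev/null

# A bundle can be cloned from like any remote.
branch=$(git -C "$src" symbolic-ref --short HEAD)
dst=$(mktemp -d)
git clone -q -b "$branch" "$bundle" "$dst/project"
```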
Reading some of the comment responses, it does sound like there’s an air of ineptitude or a long history of systems not working and devs not allowed to make them work, so they just try workarounds for everything.
My team had issues when IT accidentally changed permissions on the files inside a bare git repo located on a file share. Otherwise it works okay, as people clone and work locally. Not the best solution, but we’re working around restrictions that make this the easiest thing to do.
To be honest I’d start by asking them why it’s set up like that as diplomatically as possible. This might be a bad solution, but what pushed them to adopt it nevertheless might be an organizational peculiarity you don’t want to find the hard way.
Do you git clone from the windows share or do all just use the same share as the working tree?
With github, gitlab, etc etc, you get much more than just a bare git repo store.
My suggestion is to look at the features of each hosting solution and present those features to the team, discuss the pros and cons, and let people decide what they feel is best for the team.
Remember, you’re not proposing a simple version control system. You’re picking one very important part of an internal development platform: issue tracking, version control, code quality, security scans, developer workflow helper apps, documentation/wiki, build systems, etc. What you already have filling the roles other than version control may inform your decision on where to host your git repo.
I don’t think this is too bad, but the question here is why they set it up this way. Are there any restrictions, like no SSH? Also, this would make it hard to clone from an off-site location (for remote work).