• HaraldvonBlauzahn@feddit.orgOP · 31 minutes ago

    Just two nitpicks:

    1. On the blog article

    I gave the tutorial a quick look, but it didn’t showcase any real benefits for my workflow. It confirmed my bias: this was for people who were afraid of Git’s power, not for those who had already mastered it.

    Well, it is the tutorial by Steve Klabnik, one of the co-authors of one of the two best and most comprehensive books on Rust. As I anticipated because of that, it is a concise and lucid write-up, and probably a refreshing read for everyone who enjoys good technical writing.

    2. On jujutsu

    I have been trying it both at home and at work for some months, and it has worked very well for me (at work I was in ‘stealth mode’: nobody knew I was using jujutsu).

    It is in fact simpler and at the same time more powerful. The only area where I had hiccups is pushing to a git repo:

    • When the repo has more complex access setups, like ssh + vpn + git://…, it is better to push with git directly. In newer versions of jujutsu, this is built in.

    • When one works on several machines in parallel (my typical use case: couch-testing something on my laptop after the workday), the git repo does not contain the in-progress jujutsu changes. This leads either to conflicted changes or to regular git force pushes. Thinking about it, it is possibly better to just rsync the jujutsu repo (jujutsu supports that, because it version-controls its metadata; one has to be careful, however, not to create backup copies of the git metadata).

    • Also, jujutsu will readily rewrite private history. As a counterbalance, it has configuration settings that protect public history from being changed. The defaults are good, but the settings are still worth a look (see the sketch below).
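
    For illustration, a minimal sketch of what this looks like in practice (paths and remote names are invented; the config key is jujutsu’s immutable_heads() revset alias):

    ```sh
    # Colocated repo: jj and git share one working copy, so one can
    # fall back to plain git when the transport is awkward.
    jj git init --colocate myrepo
    cd myrepo

    jj git push             # the normal flow ...
    git push origin main    # ... or plain git for the ssh + vpn + git:// cases

    # The public-history protection lives in the user config:
    jj config edit --user
    # and extends the immutable_heads() revset alias, something like:
    #   [revset-aliases]
    #   "immutable_heads()" = "builtin_immutable_heads() | tags()"
    ```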

    Of course, Torvalds’ essential rules on public and private history still apply. (See also this article by Jonathan Corbet on rebase/merge flows, which is, I think, really good advice for larger orgs.)

  • FizzyOrange@programming.dev · 11 hours ago

    Tbh these aren’t things that are big issues with Git. The biggest issues I have are:

    • Storing large files. LFS is a shitty hack that barely works.
    • Integrating other repos. Git submodules are a buggy hack, and Git subtree is… better… but still a hack that adds its own flaws.

    Fix those and it will overtake Git in a very short time. Otherwise it’s just going to hang around as a slightly nicer but niche alternative.

    • HaraldvonBlauzahn@feddit.orgOP · 20 minutes ago

      Just one thought: jujutsu improves on the cases where several people collaborate and need not only to store, transfer, and distribute source code, but also to read and understand changes, because they work together on complex stuff. It makes it easier and much quicker to bring the change history into a logical and concise form. For some people/orgs, this improvement might not be relevant.
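
      To give an idea: the history-editing verbs are first-class, single commands (these subcommands exist as named; the branch name is invented):

      ```sh
      jj describe         # reword a change description
      jj split            # split the current change in two
      jj squash           # fold the current change into its parent
      jj rebase -d main   # move a whole stack onto another base
      ```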

    • HaraldvonBlauzahn@feddit.orgOP · 2 hours ago

      • Storing large files. LFS is a shitty hack that barely works.

      Well, git is for source control, not binary artefacts. There are indeed projects whose size is not a good match for git, but not everyone is Google or CERN.

      • Integrating other repos. Git submodules are a buggy hack, and Git subtree is… better… but still a hack that adds its own flaws.

      What are your requirements? What do you need this for? And why do you think everyone else needs the same?

      It’s quite possible you are doing it wrong. What you want as a FOSS project are probably libraries that are built, versioned, and packaged separately, perhaps using Debian packaging tools or Guix. Splitting it into real libraries with a concise API ensures that the API surface does not become too large, that the components stay relatively compact and maintainable, and that other parts of the FOSS community can re-use that library.

      Companies - especially large companies - sometimes promote vendoring instead. But this promotes their interests, not those of the FOSS community on whose creations they are building.

      Yes, git is designed to match the needs of the Open Source community! If you have a deeply intertwined multi-billion-line code base for a commercial product, a smartphone with closed firmware, or yet another TV, it might not be the best match. But who cares? Is the open source community obliged to meet such needs?

      • FizzyOrange@programming.dev · 9 hours ago

        Well, git is for source control, not binary artefacts

        Only because it is bad at binary artefacts. There’s no fundamental reason you shouldn’t be able to put them in version control.

        It’s not much of an argument to say “VCSes shouldn’t be able to store binaries because they aren’t good at it”.

        What are your requirements? What do you need this for?

        Typically there’s a third- or first-party project that I want to use in my project. Sometimes I want to be able to modify it too (soft fork).

        And why do you think everyone else needs the same?

        Because I’ve worked at at least three companies that wanted to do this. Nobody had a good solution. I’ve talked to colleagues who worked at other companies that wanted this too. Often they come up with their own hacky solutions (git subtree, git subrepo, Google’s repo, etc. etc. - there are at least half a dozen of these tools).
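
        For instance, the subtree flavour of these workarounds looks roughly like this (URL, prefix, and branch names invented):

        ```sh
        # Vendor a third-party project under third_party/libfoo.
        git subtree add --prefix=third_party/libfoo https://example.com/libfoo.git main --squash

        # Pull upstream changes later ...
        git subtree pull --prefix=third_party/libfoo https://example.com/libfoo.git main --squash

        # ... and push local soft-fork patches back out.
        git subtree push --prefix=third_party/libfoo https://example.com/libfoo-fork.git patches
        ```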

        It’s quite possible you are doing it wrong.

        No offence, but your instinctive defence of Git and your instant leap to “you’re holding it wrong” are a pretty dead giveaway that you haven’t stopped to think about how it could be better.

        • HaraldvonBlauzahn@feddit.orgOP · 3 hours ago

          Only because it is bad at binary artefacts. There’s no fundamental reason you shouldn’t be able to put them in version control.

          There is a fundamental reason: you can’t merge them. Text merges work line by line; for an opaque binary format there is no meaningful three-way merge.

        • HaraldvonBlauzahn@feddit.orgOP · 3 hours ago

          Because I’ve worked at at least three companies that wanted to do this. Nobody had a good solution.

          There are good solutions: use proper package managers with automated build support, like dpkg, pacman, pip, or perhaps uv - or, even better, Guix. Companies not doing that are just cutting corners here.
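
          As a sketch of what I mean (package name and version invented): the library lives in its own repo, is released as a versioned package, and consumers merely pin it:

          ```sh
          pip install libfoo==1.4.2    # classic pip ...
          uv add libfoo==1.4.2         # ... or uv, which records it in pyproject.toml
          ```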

        • ysjet@lemmy.world · 6 hours ago

          Just to jump in here: git submodules and similar are a terrible design pattern that needs to be killed, not expanded. Create a library properly and stop cutting corners that will bite you in the ass.

          Three separate companies wanting to do it the lazy, wrong way doesn’t suddenly make it a good idea.

        • Flipper@feddit.org · 8 hours ago

          Git was made for the Linux kernel, which is pretty much only text files. For the complete decentralisation git achieves, easy diffing and merging operations need to be defined. It works for what it was made for.

          Large files don’t work with git, as it always stores the whole history on your drive.

          For files that are large and not mergeable, SVN works better, and that is fine. You need constant online connectivity as a trade-off, though.
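
          The difference shows up directly in the checkout behaviour (URLs invented):

          ```sh
          git clone https://example.com/assets.git             # full history lands on disk
          git clone --depth 1 https://example.com/assets.git   # shallow clone, a partial workaround
          svn checkout https://example.com/svn/assets/trunk    # one revision; the server serves the rest
          ```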

          Some build tools offer the option to define a dependency as a git path + commit, or as a local path. That works quite well, but is in the end just a workaround.
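
          For example, Cargo can pin a dependency either way (crate name, URL, and hash invented):

          ```sh
          cargo add libfoo --git https://example.com/libfoo.git --rev 0a1b2c3
          cargo add libfoo --path ../libfoo
          ```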

          • HaraldvonBlauzahn@feddit.orgOP · 2 hours ago

            For files that are large and not mergeable, SVN works better, and that is fine.

            This. I have worked for a large research organization where a single SVN checkout took more than 24 hours. And they knew what they were doing.

            BTW, jujutsu - created by an engineer who happens to work at Google - supports alternative backends which are meant for very large repos. But as said, I think these do not align with the needs of the FOSS community.

  • footfaults@lemmygrad.ml · 7 hours ago

    While git’s CLI has always been atrocious (I’ve been using it daily since 2011), I’m used to it by now. So what is Jujutsu really bringing to the table here?

    Support for multiple version-control backends? Why? Everyone has pretty much settled on Git, for better or worse. I say this as someone who has even used Git-TFS, Git-SVN, and Git-CVS import tools to pull things into Git.