grimalkina,
@grimalkina@mastodon.social avatar

Something I'm curious about rn is how departments that work WITH eng but aren't classified in engineering are being barred from tools like Copilot. We saw this lag with data science and I bet there are swaths of people whose careers will be impacted by this now. Much of the time when I talk to software engineers at large companies they are shocked to learn scientists often have to fight company policy to use the same tools that we might often need. Not something "devex" seems to care about lol

shrmanator,
@shrmanator@mstdn.ca avatar

@grimalkina I hope it's because they are being cheap and not because of some BS gatekeeping... but probably not 😔

grimalkina,
@grimalkina@mastodon.social avatar

@shrmanator I think the choices we make about how resources are allocated are often justified by beliefs we hold about who is deserving

nf3xn,
@nf3xn@mastodon.social avatar

deleted_by_author

    grimalkina,
    @grimalkina@mastodon.social avatar

    @nf3xn very "good guy/bad guy" framing here...I was frequently the only person who actually cared about ethics with human behavior data and had training in it as opposed to the eng teams I worked with.

    nf3xn,
    @nf3xn@mastodon.social avatar

    deleted_by_author

    grimalkina,
    @grimalkina@mastodon.social avatar

    @nf3xn I'm not advocating for less diligence?? It's interesting that you're reading a comment about systematic organizational decision making that way. Sounds like an argument you want to have somewhere but not sure why it's with me here.

    joey,
    @joey@mathstodon.xyz avatar

    @grimalkina
    Counterpoint: Copilot is probably dangerous for non-developers who a) won't notice when it hallucinates something incorrect and b) aren't trained in how to properly test code.

    That said, Copilot as-is is pretty dangerous even with most developers, so...

    grimalkina,
    @grimalkina@mastodon.social avatar

    @joey hmm, for non-developers?! People like quant researchers and scientists, who often have better stats training than developers, are the ones who actually work with false-prediction and signal problems the most. I'd be careful about being essentialist about people here. It's this kind of stereotype that has cost me a lot in my career.

    joey,
    @joey@mathstodon.xyz avatar

    @grimalkina
    That's fair, and I shouldn't further stigmatize non-developer experts.

    I think what I really mean is that Copilot is a poor way for someone who is unfamiliar with a tool to learn that tool. Even for a domain expert, it's possible for code to look correct, i.e. to match an equation or formula or algorithm from a domain's theory, but to be wrong because of subtle implementation issues with the programming language.

    I say this as someone who spends a lot of time thinking about the way code can be wrong and the way programming languages do or don't help people see when code is wrong.

    And it turns out they all have rough corners and tricky details that need to be learned, and an LLM that doesn't actually do any reasoning is not a good way to learn those rough corners, especially if lots of the code in the wild that the model was trained on contained those common mistakes.

    LLMs turn the hard problem of writing code into the harder problems of reading code and checking if it's correct.
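
    A hypothetical sketch (in Python; the function names and data below are made up for illustration, not from this thread) of the kind of subtle issue described above: both functions transcribe the textbook identity Var(X) = E[X^2] - (E[X])^2 faithfully, yet the literal one-pass transcription can be badly wrong in floating point when the values are large relative to their spread.

    # Hypothetical illustration: same formula, very different numerical behavior.
    def variance_textbook(xs):
        # Literal transcription of Var(X) = E[X^2] - (E[X])^2
        n = len(xs)
        mean_sq = sum(x * x for x in xs) / n
        mean = sum(xs) / n
        return mean_sq - mean * mean

    def variance_two_pass(xs):
        # Numerically safer: average the squared deviations from the mean
        n = len(xs)
        mean = sum(xs) / n
        return sum((x - mean) ** 2 for x in xs) / n

    data = [1e9 + 0.1, 1e9 + 0.2, 1e9 + 0.3]
    print(variance_textbook(data))  # cancellation error: far from the true value, can even come out negative
    print(variance_two_pass(data))  # ~0.00667, the expected answer

    Both versions look correct next to the formula; only reasoning about the implementation, not the math, tells you which one to trust.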

    grimalkina,
    @grimalkina@mastodon.social avatar

    @joey I definitely share those concerns. I work a lot on the benefits of learning & skill building cultures on eng teams, and I worry deeply about these approaches being used as excuses to further cut investments in learning, mentorship, etc.

    However, I'm also really sensitive to issues of access and velocity here. My wife teaches students to code, and many of them want to be scientists, not engineers. It would disadvantage them to tell them they cannot access these tools when so many peers do.

    grimalkina,
    @grimalkina@mastodon.social avatar

    @joey I agree also with the need to understand why we are making an inference and why we are scaling up a pattern of some kind. On the other hand, it's frustrating when all you want to do is make some stupid connection work, unblock something that you KNOW is just syntax you were never taught, to let you unlock the rest of your skills. It's that dialogic work that I think matters for a lot of us outside of eng, not the desire to suddenly generate lots of code.

    mensrea,
    @mensrea@freeradical.zone avatar

    @grimalkina the more time i spend looking into these third party generative media functions the more i feel they should be banned for everybody in enterprise. if you've trained your own model, on your own curated data set, go for it @joey

    grimalkina,
    @grimalkina@mastodon.social avatar

    @mensrea @joey well, I'm interested in understanding what is actually happening in the world and how people reason about it and how it impacts them, because I'm a social scientist.

    joey,
    @joey@mathstodon.xyz avatar

    @grimalkina
    That's fair, I guess what I'm saying is that what's happening in the world is bad and dangerous and clouded in hype, and it should absolutely be studied, but I would not advocate having more people partake in it given the status quo.

    It's worth mentioning that smart people like @TaliaRinger are working on ways to do things like Copilot that are safe, don't hallucinate garbage code, and can reason. But the current incarnations of the tools scare me.
    @mensrea

    grimalkina,
    @grimalkina@mastodon.social avatar

    @joey @TaliaRinger @mensrea the monopolies, lack of data ownership and privacy, and giving this over to a few central actors is definitely not what I think is either healthy or going to yield quality. I just think it's really tough that this is also happening at a time when we have such urgent need to process a lot of data and build so much and get a lot of work done that so many have wanted to do but been cut off from. I'm not scared of stats or automation, just these power economies tbh

    grimalkina,
    @grimalkina@mastodon.social avatar

    Equity, and equity in "permissions," is actually a fascinating question re: both internal company data and tools. I remember once, when I started in tech, I spent months unable to access data I was responsible for analyzing. It was just a random eng decision that hadn't accounted for a role like mine, and no one on my side was even able to view the permissions, so it took us months to fix.

    douglascodes,
    @douglascodes@mastodon.social avatar

    @grimalkina
    I was working as a contractor in fintech, and after a week of working with a manager trying to debug something, it turned out the problem was that I wasn't in the "Everyone" group.

    paninid,
    @paninid@mastodon.world avatar

    @grimalkina this is an unappreciated aspect of enterprise product culture, particularly the undocumented, institutional non-processes for permissions

    grimalkina,
    @grimalkina@mastodon.social avatar

    @paninid definitely!! I almost did a research project on this a while back that I still think is a good one

    agvbergin,
    @agvbergin@mastodon.me.uk avatar

    @grimalkina Very much this. I used to be on a team where the one man on the three-person team accumulated permissions that were officially denied to us, because he was friendly with the all-male sysadmin team.
