Tux@lemmy.world to Technology Memes@lemmy.world · 1 month ago
Software: Then vs Now (image, lemmy.world)
cross-posted to: memes@lemmy.ml
Mako_Bunny@lemmy.blahaj.zone · 1 month ago
What Python code runs on a graphics card?
apfelwoiSchoppen@lemmy.world · 1 month ago
Phyton, not Python. 🙃
BougieBirdie@lemmy.blahaj.zone · 1 month ago
Python has a ton of machine learning libraries; I'd maybe even go so far as to say it's the de facto standard for AI development. There are also CUDA libraries which by definition do things directly on the card.
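For instance, here's a minimal sketch using the Numba library (assuming an NVIDIA GPU and a working CUDA toolkit; the `add_one` kernel is just an illustration): the decorated Python function is compiled to a CUDA kernel and runs on the card itself, not the CPU.

```python
from numba import cuda
import numpy as np

@cuda.jit
def add_one(arr):
    # Each GPU thread handles one element of the array.
    i = cuda.grid(1)  # global thread index
    if i < arr.size:
        arr[i] += 1.0

data = np.zeros(1024, dtype=np.float32)
d_data = cuda.to_device(data)   # copy the array into GPU memory
add_one[4, 256](d_data)         # launch 4 blocks x 256 threads on the GPU
print(d_data.copy_to_host()[:5])  # -> [1. 1. 1. 1. 1.]
```

Libraries like CuPy and PyTorch do something similar under the hood: the Python you write gets dispatched to precompiled GPU kernels.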
Tarogar@feddit.org · 1 month ago
Yes… it's possible to have that, even when it doesn't happen by default. The CPU can be, and still is, the bottleneck in a fair few cases, and you bet you can run shitty code on there.