I am looking to replace my old PC and am wondering what other people use.

Do you use your own hardware? If so, what do you have? What do you think gives you the most bang for your buck at the moment?

Do you use the cloud instead? If so, why? Which service(s) do you use?

Thank you!

    • joelthelion@lemmy.ml (OP) · 2 years ago

      Interesting! How do you use it? Do you connect it to your main PC? How?

      Also, what RAM does it use? Does it use the main system RAM?

  • __forward__@lemm.ee · 2 years ago

    I think major training should just be done on dedicated servers or in the cloud. That being said, it is very helpful to be able to test locally, so if you are planning on using Nvidia-equipped servers, just get any somewhat recent consumer Nvidia card; that way you can always run locally on some sample data and test much more easily.
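
    As an illustration only (the script name, the flag, and the placeholder sizes are my own assumptions, not something from this thread), one way to set this up is a single training script with a "smoke test" mode: the tiny configuration runs on the local consumer card, the full configuration on the Nvidia-equipped servers.

        # Hypothetical sketch: same script, two scales. The --smoke-test flag
        # and the sizes below are illustrative placeholders.
        import argparse

        import torch

        parser = argparse.ArgumentParser()
        parser.add_argument("--smoke-test", action="store_true",
                            help="run a tiny configuration on the local GPU")
        args = parser.parse_args()

        device = "cuda" if torch.cuda.is_available() else "cpu"
        num_samples = 64 if args.smoke_test else 1_000_000   # small sample locally
        batch_size = 2 if args.smoke_test else 256
        epochs = 1 if args.smoke_test else 20

        print(f"device={device}, samples={num_samples}, "
              f"batch={batch_size}, epochs={epochs}")
        # ... build the model and dataloader and train as usual; the same code
        # path runs locally and on the server, only the sizes differ.

    Running something like `python train.py --smoke-test` on the local card before submitting the full job means the exact code that will run on the server gets exercised end to end on sample data first.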

    • konodas@feddit.de · 1 year ago

      I second that. Being able to test medium-sized models locally can make debugging much easier.

      I have a 3070 with 8 GB of VRAM, which can train e.g. GPT-2 with a batch size of 1 at full precision.
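
      For reference, a minimal full-precision training loop of that kind might look roughly like this (assuming PyTorch and the Hugging Face transformers library; the optimizer, learning rate, and the two sample strings are placeholders, not part of the original comment):

          # Rough sketch: GPT-2 small, fp32 weights, batch size 1.
          import torch
          from transformers import GPT2LMHeadModel, GPT2TokenizerFast

          device = "cuda" if torch.cuda.is_available() else "cpu"

          tokenizer = GPT2TokenizerFast.from_pretrained("gpt2")
          model = GPT2LMHeadModel.from_pretrained("gpt2").to(device)  # full precision
          optimizer = torch.optim.AdamW(model.parameters(), lr=5e-5)

          texts = ["Example sentence one.", "Another short training example."]  # sample data

          model.train()
          for text in texts:  # batch size 1: one sequence per optimizer step
              batch = tokenizer(text, return_tensors="pt").to(device)
              outputs = model(**batch, labels=batch["input_ids"])  # causal LM loss
              outputs.loss.backward()
              optimizer.step()
              optimizer.zero_grad()
              print(f"loss: {outputs.loss.item():.3f}")

      With the 124M-parameter "gpt2" checkpoint, the fp32 weights, gradients, and AdamW state should stay well under 8 GB at batch size 1, which matches the setup described above.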

  • radical_action@lemmy.world · 2 years ago

    I would suggest just getting a laptop and a nice external setup (keyboard/mouse if you prefer, a nice monitor) plus a remote server. I bought a desktop + GPU setup back when I started my master's, but I use it shockingly little for work. The kind of workflow that a single GPU and a local machine incentivize usually goes against good scientific and experimental practice. You don't really want that machine running jobs during the day.

    As for specific cloud recommendations, I have none; I just use what is available at my institution.

    • joba2ca@feddit.de · 2 years ago

      Depends on the use case, I guess. If any larger-scale deep learning is going on, you cannot afford to buy all the required GPUs anyway.

      However, I found myself using my tower PC quite a lot during my master's. Especially for uni projects, my GPU came in very handy and was much appreciated by group members. Having your own GPU was often more convenient than using the resources provided by the lab.

      Also, while relying mostly on cloud resources in my last job, there were times when I would have found a GPU on my work machine very convenient. It is very nice for EDA and playing with models during the early phase of a project.

      Apart from that, IMO a good CPU and >32 GB of RAM on your own machine are sufficient for EDA and related tasks, while I would rely on cloud resources for everything else, e.g., model training and large-scale analyses.
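
      As a rough illustration of why 32 GB goes a long way for EDA (the file name and chunk size below are placeholders I picked, not from the comment), chunked reading with pandas keeps even datasets larger than RAM workable on a local machine:

          # Sketch: stream a large CSV in pieces so peak memory stays bounded.
          import pandas as pd

          n_rows = 0
          na_counts = None
          for chunk in pd.read_csv("data.csv", chunksize=1_000_000):  # 1M rows at a time
              n_rows += len(chunk)
              counts = chunk.isna().sum()
              na_counts = counts if na_counts is None else na_counts.add(counts, fill_value=0)

          print(f"rows: {n_rows}")
          print(na_counts.sort_values(ascending=False))  # columns with the most missing values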