Laptop Advice for Data Science (AI & ML Specialisation)

I'm looking at doing a Masters and want some advice on MacBook Airs vs Pros, as I want a more portable laptop so I can study on the train to work (90-minute commute each way).

I have a Legion 5 Pro (Ryzen 6800, 16GB RAM, 3060 GPU; considering a RAM upgrade), but it's far too heavy to lug around. Should I just get a MacBook Air, or would it be worth spending a bit more for a Pro, since faffing around between devices could be a pain? I've seen some deals recently on the M3 Pros, but I'm not sure they're worth it for my use case. Or will the extra be wasted, since most of the heavy compute will be done via cloud-based solutions?

Comments

  • Budget?

    • -2

      I thought they just hired cars.

      Have they diversified into laptops?

      • They tried to be more direct and accidentally diversified into insurance

    • Probably should have added that. $2.5k, $3k if there's justification. Happy to wait on clearance models if need be.

      • For that budget I'd be answering the following:
        1. How many personal projects will I be doing?
        2. How much do I care about weight?

        If you plan on doing a lot of large personal projects, I'd say max your budget, get as much RAM as possible, and go for a Max or Pro chip. If you're mostly going to be tinkering on the laptop, go smaller. You will likely SSH into a centralised system for the uni work.

        But if I were you, I'd let your answer to question 2 guide you. Lugging around a heavy laptop gets old quickly, and the performance hit you take with a MacBook Air is well worth it for most people. You have to ask yourself whether you're buying the laptop for the 99% use case or the 1% use case.

        EDIT: I didn't explicitly say it, but I would recommend you stick with Apple here. Apple Silicon is the most cost-effective for performance, and the unified memory means you don't have to pay twice for VRAM and RAM. The build quality is also second to none.

        Something like this would probably fit the bill. It will be able to run some small local LLMs and be a breeze for any data analysis tasks, even with large datasets.

        • Thanks for the detailed response, it really helps. Getting stuck on the 1%ers gets a bit much, particularly when you start going down the MacBook Pro customisation options.

          I was already leaning towards a MacBook, mainly to get some exposure, as I've always been a Windows guy and all the AI/ML people at work have MacBooks. This just solidifies my decision; I'll aim for something similar to the one you linked.

  • +1

    Mac Pro definitely.

    • 🤣

  • +5

    If you are going to run local LLMs, then you need as much RAM as you can get, so a MacBook Pro with as much as you can afford.
    If you are just using a lightweight computer for in-class note-taking, then there's no need for a Pro.

    Surely if you are about to do a post-grad in this stuff you should be telling us?

    • +3

      "Surely if you are about to do a post-grad in this stuff you should be telling us?" - Fair statement. I'm asking as I'm skipping the undergrad by using my work experience, which has been Analytics and Pega Decisioning based. I know enough to know that I don't know enough.

      I can't use my workplace as an example because they have more money than sense. I have my thoughts, but it doesn't hurt to speak to others who may have more specific experience.

  • +5

    Highly recommend putting your budget towards a cheap, light laptop and a decent workstation/desktop that you can remote into.
    No worries about battery life, and you will definitely get better hardware for the money.
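
A minimal sketch of that remote workflow: SSH port forwarding so a Jupyter session on the workstation shows up in the laptop's browser. The host name, user, and port here are placeholders, not anything from the thread:

```shell
# Start Jupyter on the remote workstation and tunnel its port back to
# the laptop ("me@desktop.example" is a placeholder for your own box).
ssh -L 8888:localhost:8888 me@desktop.example \
    "jupyter lab --no-browser --port=8888"
# While the tunnel is up, http://localhost:8888 on the laptop is served
# by the workstation, so heavy jobs run on the workstation's GPU.
```

The same forwarding can live in `~/.ssh/config` as a `LocalForward` entry so it applies automatically on every connection.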

  • +1

    "Or will the extra be wasted as most of the heavy compute will be done via cloud based solutions?"

    Exactly this. Any modelling you'll be expected to run locally won't be intensive. Get a mid-spec laptop, something that you can comfortably lug around.

    • +1

      Last year I built a RAG pipeline with embedding and reranking. Price-wise it's relatively expensive because it uses a lot of tokens (millions), but it was actually not very GPU-heavy. I was able to run it on my gaming PC and connect to ChatGPT via the API only for the heavy bits.

      OCR is similar: there are some extremely lightweight models that are easy to run locally, whereas pushing it through an API requires a paid account with most services (Gemini still has a free tier on its API, though).
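
The local-retrieval / API-generation split described above can be sketched in a few lines. This is a toy illustration, not the poster's actual pipeline: `embed` is a stand-in for a small local embedding model, and `build_prompt` marks the one step that would actually go out to a paid API:

```python
import math

def embed(text):
    # Toy stand-in for a local embedding model: letter-frequency vectors,
    # normalised so cosine similarity is just a dot product.
    vec = [0.0] * 26
    for ch in text.lower():
        if ch.isalpha():
            vec[ord(ch) - ord("a")] += 1.0
    norm = math.sqrt(sum(v * v for v in vec)) or 1.0
    return [v / norm for v in vec]

def cosine(a, b):
    return sum(x * y for x, y in zip(a, b))

def retrieve(query, docs, k=2):
    # Embedding and ranking happen entirely locally: this is the part
    # that doesn't need a big GPU or a paid API.
    q = embed(query)
    return sorted(docs, key=lambda d: cosine(q, embed(d)), reverse=True)[:k]

def build_prompt(query, context):
    # Only this final generation step would be sent to a hosted model
    # (the "connect via API for the heavy bits" part of the comment).
    joined = "\n".join(context)
    return f"Answer using only this context:\n{joined}\n\nQ: {query}"

docs = [
    "GPUs accelerate matrix multiplication",
    "Trains run every thirty minutes",
    "Embeddings map text to vectors",
]
top = retrieve("vector embeddings of text", docs)
print(top[0])
print(build_prompt("vector embeddings of text", top).splitlines()[0])
```

In a real setup the toy `embed` would be swapped for a local sentence-embedding model, with the same shape: cheap local retrieval, expensive generation outsourced per request.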

      • Out of curiosity, what were the specs of your PC?

        • +1

          Just a 3080 (plus a 5700X and a lotta RAM). I was only running a 7B model (it was more than enough for the project).

          I planned on buying a 16GB-24GB card and increasing the model size, but I gave up on it in the end. Shame the current deal on a 16GB 5060 Ti at Mwave looks like a headache waiting to happen, or I'd jump on it!

    • Thanks mate. Appreciate the input.
