Generative AI has taken the world by storm. One can barely find a tool that doesn't have an LLM incorporated somehow (whether it makes sense or not). But these are still very early days for this kind of tech (despite "Attention Is All You Need" being published over six years ago), and (relatively) very few people have practical, hands-on experience building Gen AI-powered apps.

Nevertheless, the public Internet (esp. LinkedIn) is already full of self-proclaimed AI experts (who, by coincidence, were tagging themselves as Web3 or Metaverse experts only half a year ago ...) who flood it with dubious advice & questionable-quality tutorials. But hey, this is what recommendations & reviews are for. Let's use the power of social networking to spread the news about attention-worthy educational Gen AI content (so all the other crap may rot in obscurity).


The course I'd like to cover today is "Generative AI with Large Language Models", but first, I need to make a few disclaimers (for the sake of transparency):

  • The course is co-produced by my employer (AWS), and three of the trainers are from AWS (I don't know them personally, though).
  • I paid for the course with my own money & went through it in my private time.
  • It was my own initiative: I was not told to take the course or write this review (which, by the way, also happened in my private time).
  • The opinions are mine, private, and should not be associated with my employer (which shouldn't come as a surprise: this is still my personal blog, after all).

After the formalities, it's time for the actual review, so buckle up.


First of all, I reached for this course because it's a (very recently announced) collaboration between AWS and DeepLearning.AI. I'd done three of their courses in the past (Prompt Engineering, LangChain, and ChatGPT API) & was positively surprised by the content (esp. keeping in mind they were all free of charge ...).

"Generative AI ..." is actually not free (49 USD - the price may have been adjusted to my geo), but the breadth of topics covered & the course's formula have also been significantly expanded. The curriculum is split into three weeks & each of which has video explanations, quizzes, reading materials & (what is most important) - interactive labs (in Amazon SageMaker Studio). The cost of labs is already covered in the course cost - you don't need to pay for any resources on the top of that.

The scope

What is in the scope of the course? Well, let's make it clear:

  • It does not teach you ML, PyTorch, or Python from scratch - in fact, it tries to introduce as few dependencies as possible (appreciated!). Full focus on Gen AI.
  • There is a theoretical introduction to the transformer architecture, but it does not overwhelm you with mathematical details - you learn exactly as much as you need to proceed.
  • This is not a course about AWS AI/ML services - yes, they are used for labs; yes, there's a short (optional) video on SageMaker JumpStart (how to deploy a chosen LLM with it), but TBH, the course is as general & vendor-agnostic as it could have been.
  • The curriculum is shaped by the lifecycle of a Gen AI application - which makes a lot of sense & provides a nice, comprehensible structure to the course; it starts with topics like pre-training or prompting, then proceeds to fine-tuning techniques (LoRA, soft prompts), and concludes with RLHF & architecture considerations for Gen AI apps in a wider context (e.g., in multi-model, agent-based architectures).

As you can see, it all makes sense. It helps you build a general big picture - what the typical building blocks of a Gen AI app are, what they are used for, etc. It was all practical - I didn't feel I was wasting time on irrelevant details. Math-wise, there was one moment (I think it was the PPO video - the optional one) when the formulas were hard to follow, but it didn't cause any trouble with understanding the concepts in the subsequent videos.

The labs

Let's move on to the most important part - the labs.

The initial impression was stellar. Everything is auto-provisioned, instructions are crystal-clear, and it all runs in the cloud - so you don't need to install anything locally. What's even more important, the labs do illustrate the particular techniques specific to Gen AI apps - you don't end up asking yourself: "What purpose does that serve? Where is this going?". Clearly, a lot of effort was dedicated to preparing these labs, but ...

Yes, there's a "but".

I like labs that actually teach me something by doing - ones that provide context & stimulate me to solve problems in a simulated, safe environment. The labs in this course do not do that - not at all! You don't write ANY code; you just run the code someone has already put there for you, compare the result with the description in the comment & move on to the next cell. Rinse & repeat. It's like watching a video that auto-pauses every minute or so - no real interaction. In fact, there are recorded "walkthroughs" of the labs, which means you're going through the same content twice.

Of course, you're free to experiment with the code in the lab, but ... are you actually equipped to do that? That's another issue - the tutors do very little to teach you the key building blocks at the code level: abstractions, mental models, and idiomatic patterns. They briefly describe the Python dependencies (libraries downloaded via pip), then introduce you to the Model & Tokenizer groups of classes (and how to initialize them from previously saved binary artifacts) & ... that's pretty much it. Don't expect to be able to craft a similar application from scratch after the course - I couldn't.
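To give you an idea of the level the labs operate at, here's a minimal sketch of that Model/Tokenizer pattern, using the Hugging Face "transformers" library (the FLAN-T5 checkpoint below is my own illustrative pick; the point is the pattern, not the particular model):

```python
# Minimal sketch (assumes Hugging Face "transformers" + PyTorch are installed).
# The checkpoint name is illustrative - a local path to saved artifacts works too.
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

model_name = "google/flan-t5-base"

tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSeq2SeqLM.from_pretrained(model_name)

prompt = "Summarize the following conversation:\n..."
inputs = tokenizer(prompt, return_tensors="pt")  # tokenize into PyTorch tensors

# generate() returns token IDs; decode() turns them back into text
output_ids = model.generate(inputs["input_ids"], max_new_tokens=100)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```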

I would have expected some walkthrough of libraries like "transformers" or "datasets" (or maybe even the whole Hugging Face open model ecosystem) to make sure the audience understands what's in there, how to use it, and how all the parts fit together in general - not only in this particular lab exercise. E.g., GenerationConfig is mentioned very briefly, but aren't you supposed to know its capabilities to build anything new on your own?
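To stay with that example: GenerationConfig is a small class, but it bundles exactly the decoding knobs you'd reach for first when building something of your own. A quick sketch (parameter values are illustrative, reusing the same FLAN-T5 setup as above):

```python
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer, GenerationConfig

model_name = "google/flan-t5-base"  # illustrative checkpoint, as before
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSeq2SeqLM.from_pretrained(model_name)

# GenerationConfig groups the decoding parameters into one reusable object:
config = GenerationConfig(
    max_new_tokens=200,
    do_sample=True,    # sample instead of greedy decoding
    temperature=0.7,   # lower = closer to deterministic output
    top_p=0.9,         # nucleus sampling cutoff
)

inputs = tokenizer("Explain LoRA in one sentence.", return_tensors="pt")
output_ids = model.generate(inputs["input_ids"], generation_config=config)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```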

So, to summarize my impression of the labs - they are good ONLY IF you expect an illustration of the concepts covered in the videos: "That is how XYZ can look in code, but don't focus on the details, only on the outcome displayed in line ABC." If you (like myself) are a builder and want to learn how to develop Gen AI apps HANDS-ON - these labs are far from enough (IMHO).

The difficulty

How high is the difficulty bar? And I mean both comprehending the materials & passing all the tests/assessments (to obtain the final certificate of completion). Frankly, it's not high - not because the content is trivial, but because the trainers do a good job of explaining the concepts.

I would not call the course particularly "intense". I finished all three "weeks" in less than one & I definitely didn't devote my afternoons exclusively to it. It's probably a deliberate maneuver to increase the potential audience w/o discouraging anyone.

Speaking of assessments, there's not much testing going on here:

  • There's a quiz after each "week" - approx. 12 questions (checkboxes/radio buttons) with an 80% pass grade, but you can retry indefinitely - I passed the first two on the first try, and the final one required a single re-attempt.
  • The quizzes are not bad (many questions test relevant, applicable knowledge), but there's also a minority that merely verify whether you've memorized some meaningless wording; there are no questions about code constructs, specifics of models, or other purely technical details.
  • Labs don't verify any skill, except clicking the "Submit" button.
  • There's no project/assessment ("homework") you'd have to do on your own. YES, I realize it wouldn't be trivial to design such an activity & (especially) a way to validate it automatically.

Was it really worth it?

Hmm, in the end - I think so. The course gives you a very good overview of the conceptual model & the lifecycle of Gen AI apps. You also get to see these concepts turned into code - although I would prefer it to be more interactive. But still - it's nice not only to hear about LoRA but also to see it applied in a working example.
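For a taste of what "applied" means in practice: attaching a LoRA adapter is surprisingly compact. A minimal sketch using Hugging Face's peft library (hyperparameter values are illustrative & the course labs' exact setup may differ):

```python
# Minimal LoRA sketch (assumes Hugging Face "transformers" + "peft" are installed).
from peft import LoraConfig, TaskType, get_peft_model
from transformers import AutoModelForSeq2SeqLM

model = AutoModelForSeq2SeqLM.from_pretrained("google/flan-t5-base")

lora_config = LoraConfig(
    task_type=TaskType.SEQ_2_SEQ_LM,
    r=16,               # rank of the low-rank update matrices
    lora_alpha=32,      # scaling factor for the adapter's contribution
    lora_dropout=0.05,
)

# Wraps the frozen base model; only the small adapter matrices are trainable
peft_model = get_peft_model(model, lora_config)
peft_model.print_trainable_parameters()  # prints the (tiny) trainable fraction
```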

The course is very well delivered (technically), not overpriced, and widely applicable to anyone interested in Gen AI - not just folks who know the AWS cloud ecosystem. IMHO, it makes sense to treat it as a decent intro to the courses hosted on Hugging Face, e.g., their NLP course, which definitely fills the "hands-on" gap (but also repeats some of the basic content!).
