🎉Announcement: The AI Assistant, Skills Mapping, and AI-powered learning paths are live!🎉

RyanJaress
RyanJaress Posts: 336 Udemy rank
edited November 1 in Community News

Hey Instructors,

Today we’re thrilled to share that we’ve officially announced Udemy’s newest AI features: the Udemy AI Assistant, Skills Mapping, and AI-powered learning paths. These transformative features are now available to learners and leaders within organizations, giving them a way to deliver more personalized learning programs at scale.

For more information, check out the full press release and blog post.

Thank you to all of the instructors who have contributed to all of the ongoing product discovery work. As we continue to innovate, we will remain true to our mission, now enhanced by the power of AI, with humans at the heart.

Teach on,

The Udemy Instructor Team



Comments

  • I agree with @ScottDuffy. These kinds of announcements may be good for getting investors excited and raising the stock price, but they do very little for us instructors.

    I feel totally disconnected from how these features are supposed to work, as so far we have only seen mock-ups and screenshots. We should have the chance to be among the first to try this out, report issues, and help you build a better product.

  • FrankKane
    FrankKane Posts: 1,874 rolemodel rank
    edited October 30

    I noticed that the AI Assistant was described in today's earnings call as featuring summaries of courses and lectures.

    Is that the assistant's primary purpose?

  • Veasna, M.
    Veasna, M. Posts: 179 storyteller rank

    Useless!

  • MarinaT
    MarinaT Posts: 2,148 Udemy rank
    edited November 1

    Hi @FrankKane,

    Great question! While summarizing content is one of the AI assistant's features, it also leverages its understanding of Udemy instructor content to recommend relevant courses, explain complex concepts, and more.

  • FrankKane
    FrankKane Posts: 1,874 rolemodel rank
    edited November 1

    I went back and reviewed the GenAI Product Preview demo and slides, and didn't see any mention of course and lecture summaries there. It appeared only toward the end of the FAQ:

    When will the AI Assistant’s “summarize course/lecture” feature show up? How will it impact course watch time and instructor revenue?

    Summarization of a course or a lecture will only show up if a learner asks for a summary during content discovery. For in-course, there will be a prompt that learners can click to summarize the lecture or course. These summaries are intended to be brief and act like the Course Landing Page to drive learners to consume more lectures, rather than to replace the lecture.

    I feel like this feature was downplayed to us. If it is widely used by learners to quickly skim through courses in a way that doesn't credit us with minutes and moves consumption to a place with zero revshare, then instructors who did not opt out will find themselves at a disadvantage. Is it still the case that these summaries will be too brief and unspecific to be used as a replacement for the actual content? Can UB learners mark a lesson as viewed if all they did was read the summary? It would be really helpful to see it in action to allay our fears.

    Also, the opt-out terms said they would go into effect at the end of 2024. Have AI features been deployed into production using data from opted-out instructors?

  • RyanJaress
    RyanJaress Posts: 336 Udemy rank
    edited November 5

    Hey there @FrankKane

    Thank you for your feedback! We value your input and understand your concerns. To clarify, the AI summarizing feature is designed to enhance, not replace, the learning experience. It offers students a brief, high-level overview of each lecture, encouraging deeper engagement with the course content. If students are looking for specific information, the AI assistant can direct them to the relevant video, thereby enhancing overall course engagement. A lesson is not marked as viewed just by reading the summary, as our goal is to maintain the integrity of the learning experience and ensure that instructors are credited appropriately.

    We want to emphasize that the summarizing tool is designed to help our busy learners quickly and conveniently locate the specific video content they need, ultimately boosting video engagement—and, in turn, increasing revenue for instructors. 

    Also, for further clarification: we won't use opted-out instructor content. We honor instructor opt-outs as outlined in Section 6 of our Instructor GenAI Policy.

  • FrankKane
    FrankKane Posts: 1,874 rolemodel rank

    Thank you for the response, @RyanJaress

  • I do not understand the purpose of the Udemy AI. What is in it for me?

  • ScottDuffy
    ScottDuffy Posts: 897 rolemodel rank

    @RyanJaress Let's see it :)

    Create a video using the summarizing feature a few different ways. Create a video using the search feature a few different ways. Let's see it in action.

    cc: @GenefaMurph976

  • Congrats to the Udemy team for rolling this out. I think it could be interesting to share with instructors some analytics on how this gen AI feature is used. For example, if someone uses the tool to break a "data engineer" role down into specific skills, such as "data remediation" or "data profiling", and my course is recommended as part of the composite learning path, those data points could be saved and turned into insights for instructors. I would then know that my courses are recommended by the gen AI engine for "data engineer" roles, or for the "data profiling" skill specifically. On an aggregate level, that could be interesting: different courses could be recommended for different skills or keywords within the same role or purpose.

    We already have some basic analytics on students, regions, skills, etc. But this gen AI engine seems to go much deeper, and making those super-specific analytics available could help instructors, too. Right now I feel "blind" as to how the tool is used, who uses it, for what roles, and for what skills. It would be good to at least have a few pointers, e.g. the top 5 roles the gen AI recommends my courses for, and the top 5 skills it recommends them for.

    Because, let's face it, from the moment this tool was announced, with its potential to make or break revenue streams, most, if not all, instructors will be optimizing their future courses to be indexed by the tool (creating standalone modules, having modules for specific skills, not referencing other modules in the same course, etc.). Knowing what skills/roles our courses are recommended for could help guide this creation further, IMHO!

    Also, @FrankKane, I think the summarization feature will, in fact, cause instructors to take a small hit. Sure, as Ryan said, reading the summary itself won't mark a lesson as watched, but someone can still… read the summary… and then mark the lesson as watched anyway. You can mark any lesson as watched without actually watching it (which is presumably why the feature exists), so some people will in fact read the summary and mark the lesson as watched. It's one of those things where, just because the feature exists, you know some people will use it that way. It's just how people are. I do think most people watching a course are there to, well, watch, so it won't be a big destabilizer, but there may be 1-2% of people who read summaries and skip several lessons. For example, if a course is mostly on one topic but has 10% tangential, non-crucial content on other areas, someone can just summarize that part as text and skip it. It would be interesting to see how this applies to courses that touch on various separate yet important topics, such as masterclasses, and to courses that have an "essential" and a "bonus" part, for example.

  • @RyanJaress Thank you. It seems interesting. Hope it benefits all stakeholders.

  • Can we get a specific example, please? How brief is it? 50 words … 100 words?

  • AHardin
    AHardin Posts: 578 visionary rank

    I have the Udemy Personal Plan and the AI Assistant Beta is enabled on my end, as a student. I have access to the course recommendations and in-course AI assistant. I did a quick test drive of both, and it's clear they're both still in beta. Here's some quick feedback based on some initial testing of them both:

    Course Recommendations AI Assistant

    1. If you ask it to find a certain category of course, it returns three results, telling you why they're a good fit based on your initial question. If you want to see more courses, you have to tell the chatbot to do so with another prompt.
    2. If you ask it about a topic, such as digital marketing, that you're interested in learning, it'll return a list of relevant course categories to explore.
    3. It doesn't have the ability to identify best-selling courses. I asked it to find best-sellers and it said it couldn't do that.
    4. It doesn't show course badges (best-seller, highest-rated, etc.) and sometimes recommends poorly rated courses over higher-rated courses (it recommended a 2.9-star course as its top result in one test).
    5. Not all categories are indexed yet. I have the best-selling and highest rated poker courses on Udemy and it couldn't find a single poker course in the Udemy catalog.
    6. At one point it presented me with a hallucinated response of just "[ ]", and it sometimes won't show the images and links associated with courses, just text.

    In Course AI Assistant

    1. It'll only answer questions related to the course you're taking, which is both good and bad. It's good because students can't ask completely unrelated questions to the chatbot. But it could be bad because if a student has a somewhat loosely related question, the chatbot may deem it falls too far outside the scope of the course and may choose not to answer it.
    2. It's so-so at summarizing lectures. I tested it in one of my courses and one response was "I'm not sure what to say." It seems to do okay with short lectures (2-3 minutes), but struggles with longer ones (10+ mins) that cover lots of sub-topics. The shorter lectures had good, concise summaries, whereas the longer ones were missing a lot of information.
    3. The lecture summaries are short, so for those of you worrying it would replace your lectures - don't worry, it doesn't. They look like they tend to be 4-6 sentence high-level summaries.
    4. It looks like it may help decrease Q&A student questions, as it does seem to do a decent job answering student questions. I tested it with actual student Q&A questions and the answers seemed decent.

  • LawrenceMMiller
    LawrenceMMiller Posts: 2,301 rolemodel rank
    edited November 13

    I went into two of my own courses, went to the curriculum, clicked on the Preview button in the upper right corner, then chose a lecture, and the AI assistant appears to the right, along with the Course content. I tried it with two of my courses and asked questions. The answers were drawn from my course and were consistent with the definitions and use of language in my course. For example, in my course on Consultative Selling, I simply asked "what is consultative selling?" The right answer could only have been drawn from my course. The answer was perfect. In fact, I wish I had included such a concise, yet complete, explanation in the course myself.

    I had been under the impression that the Assistant would refer you to different lectures or different courses. It doesn't do that, at least from my experiments. It just creates an answer from the information and language within your course. I think that is very good.

    So far, from what I have seen with my own courses, I give the AI Assistant an A-plus!

    I think that students will become accustomed to using the AI Assistant, and if your course is opted out, I am afraid you may be at a serious disadvantage.

  • MarinaT
    MarinaT Posts: 2,148 Udemy rank
    edited November 13

    Hi @AHardin and @LawrenceMMiller,

    Thank you for sharing your hands-on experience and feedback on our AI assistants! We're thrilled to hear that you like them (A+ from Lawrence 👏), and of course, we’ll forward your thoughts to our internal team 😊.

  • ScottDuffy
    ScottDuffy Posts: 897 rolemodel rank

    Good job, Udemy.

  • I went to one of my courses which has a few assessments based on the concepts taught in the course. I prompted the assistant: 'I am unable to understand assessment 7, can you please explain it?' The assistant gave very good hints about which concepts need to be revised to solve the assessment, without actually spelling out the solution. It also said it would provide more specific clarification if required.

    If I were a student, I would definitely be happy!

  • MarinaT
    MarinaT Posts: 2,148 Udemy rank

    Yay! We are glad you like it, @ScottDuffy & @AnweshaSengupta!

  • I tried out the in-course AI assistant on my own course, and I really like it. It does a good job of summarizing the content of my course and explaining it. Great job, Udemy!

  • LawrenceMMiller
    LawrenceMMiller Posts: 2,301 rolemodel rank
    edited November 15

    I agree with @Jonas Schmedtman. While we all seem to "like" the AI Assistant, whether it is good or bad for our business is an entirely different question. We get paid (UB) for minutes watched. If a student quickly goes to the AI Assistant and asks a question summarizing the topic of a lecture, they may choose to read that and then go on to the next lecture, spending little or no time watching the video. If I understand the system correctly, this means that our engagement numbers may go down significantly. I think this is at the heart of Jonas' concern.

    This is an empirical question. Since this month is the first month with the system being live, we should all be looking at our engagement numbers compared to prior months. There are always multiple factors affecting engagement, but so far it looks like engagement is down this month. Correlation or causation?

  • @LawrenceMMiller Yes engagement is at the heart of my concern. But besides that, I think this design choice just sends the wrong signal to students. It makes it look like the course lectures are less important than just chatting with an AI. I don't want that. I want my course content to be the main focus.

    As for revenue, I guess that everyone will be affected more or less in the same way, meaning that consumption may go down a lot, but the same for everyone. In that case, given that the total UB amount will stay roughly the same, $/min will simply increase for everyone, meaning that total revenue should not change for the individual instructor.
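The pool-based math I'm assuming here can be sketched in a few lines of Python. To be clear, the pool size, names, and minute counts are made-up numbers, and this is only my guess at how the UB split works, not Udemy's actual formula:

```python
# Hypothetical model: a fixed revenue pool split proportionally to minutes watched.
def payouts(pool: float, minutes_by_instructor: dict[str, float]) -> dict[str, float]:
    """Split a fixed pool among instructors in proportion to minutes watched."""
    total_minutes = sum(minutes_by_instructor.values())
    return {name: pool * m / total_minutes
            for name, m in minutes_by_instructor.items()}

# Baseline month: two instructors share a $100k pool.
before = payouts(100_000, {"alice": 4_000, "bob": 6_000})

# If AI summaries cut everyone's minutes by the same 30%, each instructor's
# share of total minutes is unchanged, so each payout is unchanged too --
# the effective $/min simply rises to compensate.
after = payouts(100_000, {"alice": 4_000 * 0.7, "bob": 6_000 * 0.7})
```

Of course, this only holds if consumption drops uniformly; if summaries hit some course types harder than others, the shares shift.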

    One thing that I find frustrating, and that I don't see being talked about anywhere, is the fact that Udemy still hasn't announced revenue share for GenAI tools. I heard multiple times "when Udemy wins, instructors win". Well, in the teaching center it still says: "We’ll keep you informed of these changes as we finalize the details of this updated revenue sharing model." Now that these AI tools are apparently online, where is the update? How will revenue be shared? Any update on this @MarinaT?

  • Yes, and text chatting with the AI will hardly take the same time as watching the video lessons. Also, the answers are often very detailed, with content from the course.

  • I want the Course content tab to be the default tab,
    or an option to enable/disable the AI assistant tab <3

  • While I’m excited about the potential of GenAI, I’m currently underwhelmed by the AI Assistant. It performs well in some areas but fails noticeably in others.

    I feel the product might have benefited from more instructor input before launch. Slowing down the rollout could allow for critical fixes before students begin to see it as less useful, potentially harming its reputation (which helps no one). Below is my constructive feedback:

    I. UI/UX Issues and Suggestions

    1. Course Outline Visibility
      • The course outline is no longer visible when watching the course (moving to the next lecture). I suggest reactivating and focusing the course outline when transitioning lectures, while allowing students to switch to the AI Assistant as needed.
      • Feedback: This will make navigation more intuitive and reduce friction for students.
    2. Message Padding
      • Messages from the assistant have excessive left/right padding, resulting in overly long responses.
      • Suggestion: Reduce the padding to make messages more concise and visually accessible.
    3. Expand Button Visibility
      • The button to expand the AI Assistant is almost hidden, and many students might not notice it.
      • Suggestion: Introduce a quick onboarding guide or intro video for students, highlighting how to access and use the assistant or redesign the button altogether.
    4. Context Window Limit
      • Entering code often triggers an error: “Sorry, that message exceeds the 1500-character limit.” This is especially problematic for IT courses.
      • Feedback: Consider increasing the character limit for such use cases or offering a workaround for code-related queries.

    II. Functionality Issues and Suggestions

    1. Generic Responses for Summarization Prompts
      • Prompts like “summarize this course” or “summarize this lecture” return overly generic, practically meta-level responses with little use.
      • Suggestion: Either improve these summaries or reword/remove the prompt to set accurate expectations.
    2. Q&A Replacement Results
      • Non-IT Courses: Responses are generally good, pulling relevant information from course content and Q&A.
      • IT Courses: Responses are often too generic, overly broad, or fail to leverage existing Q&A/course material.
      • Feedback: IT-related responses should better utilize specific course content or Q&A data to ensure relevance.
    3. Long-Winded Answers
      • Some responses are unnecessarily verbose. Simple queries (e.g., “Is X a feature of tool Y?”) generate lengthy explanations with key points, examples, and summaries.
      • Suggestion: Optimize responses for brevity and clarity based on the complexity of the query.
    4. Predefined Prompts
      • Prompts often seem irrelevant to the current lecture (e.g., “What does the instructor cover in this course?” mid-course).
      • Sometimes the prompts don’t make sense. Example: Show example code for <NON IT TOPIC>
      • Suggestion: Predefined prompts should dynamically adapt to the context of the lecture or course section.
    5. Refusal to Provide Links
      • The assistant refuses to share links, even if they are provided as lecture resources. Additionally, lectures are not linked for direct navigation.
      • Suggestion: Allow the assistant to share pre-approved, course-related links and enable lecture linking for easier navigation.

    III. Examples of General Prompt Failures

    1. Prompt: “In which lecture do I learn about the concept Y?”
      Response: “I am not able to answer this question for now.”
      • Feedback: The assistant should know the contents of sections and lectures.
    2. Prompt: “In which lecture is Y explained?”
      Response: Discusses topic Y but doesn’t identify the relevant lecture.
      • Suggestion: The assistant should either pinpoint the lecture or indicate the information is unavailable (if that is really the case).
    3. Prompt: “Which section explains the installation?”
      Response: “I don’t have specific information on which section covers the installation.”
      • Feedback: The assistant should know the contents of sections and lectures.
    4. Prompt: “How do I apply the information from this course to my job?”
      Response: Dives into unrelated topics not covered by the course.
      • Feedback: Ensure responses align with the course content and context.

  • Effectively replacing the "Course Content" stripe with the AI nonsense is a real problem for me, since my "Course Setup" lectures mention (and show screenshots of) the "Resources" folders in that stripe. Now, when someone is in the course and looks at my screenshot, it differs: the student sees an AI stripe, while all my Course Setup lectures show a "Course Content" stripe.

  • Thanks for that detailed feedback, Valentin. I noticed several of those too, but did not mention them.

  • This is super good feedback, all - thank you for taking the time to provide it. Definitely taking it all in! Keep it coming!