Preparing to teach a class where you "don't know the material" is tricky, especially when the subject is so big and so vast that you can never truly be an "expert" in it -- if you think of expertise as knowing all the things, that is. If you think of expertise as the ability to be productively lost, that changes the entire game: the trick becomes how to help your students navigate the same sort of territory.

I don't have an overarching framework of strategies, but we do have a few things to share from the Teaching Open Source world. I wish we had a better write-up of our overall philosophy, but it was never sufficiently developed (that's something I'd love to work on with others... later... after... thesis...). You can infer a bunch from some of our artifacts, though, so here you go, in order of least to most interesting, in my heavily biased opinion.

1. We have generic project-helpful activities.

We have a number of learning activities that (1) are highly likely to be useful for students regardless of which project they're working on, and (2) have clear criteria for successful completion -- in other words, they are assessable. Analogous college-type activities might be things like "make a test plan for your object" or "determine the appropriate formulae for predicting the behavior of your material" (I'm obviously making these examples up in a domain -- materials science -- that I don't know much about).

2. We have specific tool walkthroughs.

We also have activities that walk through specific skills/tools common to most projects, often on a known setup (in the case below, a dummy server). This would be something like "do this intro exercise to use the Instron for the first time with our pre-cut samples."

3. We have activities about critiquing the work of others (not other students in the class -- this is not about peer assessment).

Moving into more interesting stuff: we also have activities that are about looking at other people's work -- not making new artifacts, but critiquing existing ones, to start developing a sense of how experts see things. In college, that might be: "Look at these pages from 3 lab journals and compare/critique them; what makes the good ones good, and the bad ones less helpful?" The reason it's not peer critique is that you want to curate the examples so they're rich, with a range of points you can pull out in the discussion. (For those of you with qualitative research backgrounds, you can think of this as artifact analysis.)

4. We have resources (not just live demos) that lay out expert thinking.

We have think-alouds, where (more) experienced people demonstrate their thought process while being "productively lost" and then unpack it for newcomers, so newcomers can start comparing metacognitive strategies. Every time you think out loud with students about their project, you're doing this -- but sometimes making artifacts is helpful, too. Also, accessibility is super-important here: if you're making videos, caption them; if you're including images, describe them; and so forth. (My images are screenshots of the linked webpages, so I didn't write image descriptions.)

5. We frame their mindset explicitly.

We have documents that explain the state of mind / viewpoint / psychological priming we want students to adopt toward their work.