I wonder to what extent there is anyone who could cope with the Oxford course but who could not breeze through A-level Further Maths. I entirely understand that the converse is false: there are people who are extremely good at exams but who simply couldn't cope with the Oxford course.<p>Almost all of those on the Cambridge maths course (which I'm taking to be similar) found A-level maths/further maths very easy. Of course, the admissions process selected for such people, but I think "finds A-level maths content very easy" is a prerequisite for the course.
This is just speculation, but I have some ideas about what effects might be in play here.<p>First, the exams test knowledge, not problem-solving ability, which is what he wants. It <i>may</i> be possible to design a test for problem-solving ability; that's essentially what IQ tests are designed to do. As I understand it, the SAT used to resemble an IQ test and correlated highly with IQ.<p>Second, exams probably do correlate with IQ (almost everything does), but correlation isn't enough. You can't just take the top n results from a test. You can use a test as a filter, e.g. sampling the top 5% of results or whatever, but when you set the filter too high, you just get outliers.<p>E.g. the top 100 people who are freakishly good at taking tests, and nothing else. Maybe they have abnormally good memories and simply remember everything they read in the textbook. Or maybe they have parents who pushed them to study excessively, or they did so on their own.<p>Tests should only be used as a filter, a minimum standard, not optimized for directly.
From speaking to people who have gone to Oxford and Cambridge, I don't think they are simply letting in the brightest minds, nor do I think they are truly trying to find them: there is a definite "fit" for these places. If I had to characterise it, I'd say "smart, and very academic/scholarly". They all seem to have a deep appreciation of some subject or topic, and the admissions process seems very good at uncovering it.<p>The other thing everyone from there talks (whines) about is the sheer volume of work they are expected to complete, and for that, exams seem like a good (albeit imperfect) proxy.
I didn't do A-levels, but from my understanding of them I have always thought that they don't discriminate between candidates well enough; that is, there just are not enough grades. I think the passing grades you can get are A, B, C or D. Most people who get into the top universities get straight As. If each of these grades were split into three, the result would be far more meaningful, and it would go a long way towards ensuring that the best candidates got invited to study at the university (I think the entire interview process is fraught with huge bias problems).
This article speaks directly to the problem that Art of Problem Solving is trying to address. <a href="https://www.artofproblemsolving.com/" rel="nofollow">https://www.artofproblemsolving.com/</a>
Richard Rusczyk wants to teach kids how to think and how to solve novel problems, and he hates the normal approach of teaching rote formulas. No student steeped in the AoPS way would crash during the Oxford interviews; an Oxford interview is their "middle school normal".<p>The AoPS textbooks are the best math textbooks I've seen anywhere. The online classes are great, but the pace is blistering; most kids should use the AoPS books, just at a more suitable pace. (And by blistering, I mean my daughter was very seriously challenged to keep up with her peers all the way through single-variable calculus, where AoPS tops out. She then enrolled in multivariable calculus at a local engineering university at age 13 and blew away the curve.)<p>AoPS is an example of where learning can go. AoPS teaches how to think, not how to take tests.
Ed Frenkel told a great story about how Soviet Russian oral exams were used to discriminate against Jews by giving them harder questions and demanding more sophisticated answers.<p><a href="http://www.npr.org/2014/03/28/295789948/the-real-problem" rel="nofollow">http://www.npr.org/2014/03/28/295789948/the-real-problem</a>
I was curious to see the plot of log(log x).<p>It's weird. Fortunately we have Wolfram Alpha so it was trivial to quickly see it: <a href="http://www.wolframalpha.com/input/?i=plot+log(log+x)" rel="nofollow">http://www.wolframalpha.com/input/?i=plot+log(log+x)</a>
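For a quick sanity check without Wolfram Alpha, the function is easy to tabulate yourself. The key facts: log(log x) is only real for x > 1 (since log x must be positive), it crosses zero at x = e, and it grows extraordinarily slowly. A minimal sketch:

```python
import math

def loglog(x: float) -> float:
    """log(log x): real only for x > 1, since log x must be positive."""
    return math.log(math.log(x))

# Zero at x = e, because log(e) = 1 and log(1) = 0.
print(loglog(math.e))        # 0.0
# Glacial growth: even at x = 10**100 the value is still tiny.
print(loglog(10.0))          # ~0.83
print(loglog(10.0 ** 100))   # ~5.44
```

The "weird" look of the plot comes from the steep plunge to negative infinity as x approaches 1 from above, followed by an almost-flat crawl upward forever after.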
<i>The rise of continuous assessment models and the ability to track students’ every interaction with digital learning content may allow for broader, more holistic evaluations of student potential.</i><p>This is merely a complex way of saying "metrics". Welcome to the real world, Oxford kiddies.