Exam Grades
Comments
-
The drop in degree standards is because there are more degrees. The top degrees in credible subjects are just as hard and competitive as they ever were.TheBigBean said:For what it is worth, I don't really care about A-Levels - I knew mine were much easier at the time. I object to the drop in degree standards as mine wasn't easy.
The "degrees" that are what we would have considered to be entirely different types of qualifications, but in a fancy frock, or the playschool subjects cobbled together and padded out to make them a "degree" drag the entire system down.
You cannot have degrees taken at twice as many institutions by three times as many people, and expect the bottom half to be of the same standard.0 -
Even the top ones hand out much higher grades.First.Aspect said:
The drop in degree standards is because there are more degrees. The top degrees in credible subjects are just as hard and competitive as they ever were.TheBigBean said:For what it is worth, I don't really care with A-Levels - I knew mine were much easier at the time. I object to the drop in degree standards as mine wasn't easy.
The "degrees" that are what we would have considered to be entirely different types of qualifications, but in a fancy frock, or the playschool subjects cobbled together and padded out to make them a "degree" drag the entire system down.
You cannot have degrees taken at twice as many institutions and three times as many people, and expect the bottom half to be of the same standard.0 -
E.g. UCL has gone from 24% to 40% firsts in seven years.
https://www.bbc.co.uk/news/education-48951653
0 -
I think it's interesting that so much weight is put on the grade. In my field, because you need to show examples of your work in any job interview scenario, there is much less reliance on the grade, because much stronger evidence of ability is right there in front of you.1985 Mercian King of Mercia - work in progress (Hah! Who am I kidding?)
Pinnacle Monzonite
Part of the anti-growth coalition0 -
An interesting video explaining just how inaccurate Ofqual's algorithm was.
https://youtu.be/wZODW080gsc
Tl;dr it was only about 60% accurate when applied to 2019 data and they published this in their own report. (page 204 if anyone wants to look).1985 Mercian King of Mercia - work in progress (Hah! Who am I kidding?)
Pinnacle Monzonite
Part of the anti-growth coalition0 -
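A minimal sketch of what a figure like "about 60% accurate" might mean, assuming it is read as the share of pupils whose model-backcast grade exactly matched the grade actually awarded in 2019. This is not Ofqual's code, and the data and names below are made up for illustration; the real figures are in the report itself.

def exact_match_accuracy(predicted, actual):
    """Fraction of pupils whose predicted grade equals the grade really awarded."""
    assert len(predicted) == len(actual)
    matches = sum(1 for p, a in zip(predicted, actual) if p == a)
    return matches / len(actual)

# Hypothetical toy data: grades a model backcast for 2019 vs. what was awarded.
predicted_2019 = ["A", "B", "B", "C", "A", "D", "B", "C", "A", "B"]
actual_2019    = ["A", "B", "C", "C", "B", "D", "B", "B", "A", "C"]

print(f"Exact-match accuracy: {exact_match_accuracy(predicted_2019, actual_2019):.0%}")
# -> 60% on this made-up sample.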
How do you evaluate graduates?rjsterry said:I think it's interesting that so much weight is put on the grade. In my field, because you need to show examples of your work in any job interview scenario, there is much less reliance on the grade, because much stronger evidence of ability is right there in front of you.
0 -
I have been encouraging people to read the report. The executive summary is not that long, but it is more than 140 characters.rjsterry said:An interesting video explaining just how inaccurate Ofqual's algorithm was.
https://youtu.be/wZODW080gsc
Tl;dr it was only about 60% accurate when applied to 2019 data and they published this in their own report. (page 204 if anyone wants to look).0 -
If not, Paul Johnson has spent the weekend reading it and has a great piece in today’s Times.TheBigBean said:
I have been encouraging people to read the report. The executive summary is not that longer, but it is more than 140 characters.rjsterry said:An interesting video explaining just how inaccurate Ofqual's algorithm was.
https://youtu.be/wZODW080gsc
Tl;dr it was only about 60% accurate when applied to 2019 data and they published this in their own report. (page 204 if anyone wants to look).0 -
"Almost all the teachers we interviewed told us that they had generally predicted how the students would perform on a ‘good day’. Although they knew that every year some students underperform or have a bad day, this was not the basis of their judgements. This might be as expected, but the cumulative effect of this optimism, if reflected in the final results, would have undermined confidence in those results."TheBigBean said:
I have been encouraging people to read the report. The executive summary is not that longer, but it is more than 140 characters.rjsterry said:An interesting video explaining just how inaccurate Ofqual's algorithm was.
https://youtu.be/wZODW080gsc
Tl;dr it was only about 60% accurate when applied to 2019 data and they published this in their own report. (page 204 if anyone wants to look).
This is absolutely insane.0 -
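As a back-of-envelope illustration of that cumulative-optimism point - the 70% chance of a "good day" and the one-grade slip on a bad day are my assumptions, not the report's - a quick simulation shows how submitting everyone's good-day grade inflates the cohort even when each individual prediction looks fair:

import random

GRADES = ["A*", "A", "B", "C", "D", "E", "U"]  # best to worst

def exam_day_grade(good_day_grade, p_good_day=0.7, rng=random):
    """On a bad day the pupil slips one grade below their good-day grade."""
    idx = GRADES.index(good_day_grade)
    return good_day_grade if rng.random() < p_good_day else GRADES[min(idx + 1, len(GRADES) - 1)]

random.seed(1)
cohort_good_day = ["A"] * 30 + ["B"] * 40 + ["C"] * 30          # what teachers submit
cohort_exam_day = [exam_day_grade(g) for g in cohort_good_day]  # what exams would show

for grade in ["A", "B", "C", "D"]:
    print(grade, "submitted:", cohort_good_day.count(grade), "exam-day:", cohort_exam_day.count(grade))
# Roughly 30% of this made-up cohort would sit a grade below the submitted figure.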
I found maths A level at about the right level (maths with mechanics and pure maths from memory), personally. In the first year at university (Civil Engineering at Leeds), we had to do maths again - the pure stuff - and going back over all this was... annoying. It was like being taught how to read again.rjsterry said:That being said, what is the material impact? I know my brother found that the easier maths A level course meant he found his first year of Mechanical engineering at Imperial a bit heavy going but once he caught up it was a non issue.
But it soon became clear why it was done. Others who'd left other sixth forms or colleges with A or B (entry requirement) in A level maths were simply not at the same level as some. They didn't make it to 2nd year - difficult to study English if you can't read.
So some levelling up was required then (1998) and from all the discussion these past few weeks/months, I am not sure that has happened. We could have squeezed in a further engineering module rather than going back over calculus...Ben
Bikes: Donhou DSS4 Custom | Condor Italia RC | Gios Megalite | Dolan Preffisio | Giant Bowery '76
Instagram: https://www.instagram.com/ben_h_ppcc/
Flickr: https://www.flickr.com/photos/143173475@N05/
0 -
Did they enlist Neil Ferguson for help with this modelling?rjsterry said:An interesting video explaining just how inaccurate Ofqual's algorithm was.
https://youtu.be/wZODW080gsc
Tl;dr it was only about 60% accurate when applied to 2019 data and they published this in their own report. (page 204 if anyone wants to look).0 -
Why? How do you predict who’s going to have a bad day? We had some we were pretty confident were going to get grade 4 (high performing grammar school so about the lowest we ever get) so we put those in. What if theirs get downgraded? We had a 3 in the year group 3 years ago so...kingstongraham said:
"Almost all the teachers we interviewed told us that they had generally predicted how the students would perform on a ‘good day’. Although they knew that every year some students underperform or have a bad day, this was not the basis of their judgements. This might be as expected, but the cumulative effect of this optimism, if reflected in the final results, would have undermined confidence in those results."TheBigBean said:
I have been encouraging people to read the report. The executive summary is not that longer, but it is more than 140 characters.rjsterry said:An interesting video explaining just how inaccurate Ofqual's algorithm was.
https://youtu.be/wZODW080gsc
Tl;dr it was only about 60% accurate when applied to 2019 data and they published this in their own report. (page 204 if anyone wants to look).
This is absolutely insane.
0 -
That's the bit that is insane. "We know someone is going to have a bad day, so bad luck Emma, we've decided it would have been you.".johngti said:
Why? How do you predict who’s going to have a bad day? We had some we were pretty confident were going to get grade 4 (high performing grammar school so about the lowest we ever get) so we put those in. What if theirs get downgraded? We had a 3 in the year group 3 years ago so...kingstongraham said:
"Almost all the teachers we interviewed told us that they had generally predicted how the students would perform on a ‘good day’. Although they knew that every year some students underperform or have a bad day, this was not the basis of their judgements. This might be as expected, but the cumulative effect of this optimism, if reflected in the final results, would have undermined confidence in those results."TheBigBean said:
I have been encouraging people to read the report. The executive summary is not that longer, but it is more than 140 characters.rjsterry said:An interesting video explaining just how inaccurate Ofqual's algorithm was.
https://youtu.be/wZODW080gsc
Tl;dr it was only about 60% accurate when applied to 2019 data and they published this in their own report. (page 204 if anyone wants to look).
This is absolutely insane.
This year's surely should have been awarded based on having a good day, and if that means grade inflation, that's the way it is.0 -
Garbage in = Garbage out (GIGO) is a well-known computer science and mathematics concept.kingstongraham said:
"Almost all the teachers we interviewed told us that they had generally predicted how the students would perform on a ‘good day’. Although they knew that every year some students underperform or have a bad day, this was not the basis of their judgements. This might be as expected, but the cumulative effect of this optimism, if reflected in the final results, would have undermined confidence in those results."TheBigBean said:
I have been encouraging people to read the report. The executive summary is not that longer, but it is more than 140 characters.rjsterry said:An interesting video explaining just how inaccurate Ofqual's algorithm was.
https://youtu.be/wZODW080gsc
Tl;dr it was only about 60% accurate when applied to 2019 data and they published this in their own report. (page 204 if anyone wants to look).
This is absolutely insane.0 -
It is, but I don't see why Ofqual have introduced a random garbage factor.coopster_the_1st said:
Garbage in = Garbage out (GIGO) is a well known computer science and mathematics conceptkingstongraham said:
"Almost all the teachers we interviewed told us that they had generally predicted how the students would perform on a ‘good day’. Although they knew that every year some students underperform or have a bad day, this was not the basis of their judgements. This might be as expected, but the cumulative effect of this optimism, if reflected in the final results, would have undermined confidence in those results."TheBigBean said:
I have been encouraging people to read the report. The executive summary is not that longer, but it is more than 140 characters.rjsterry said:An interesting video explaining just how inaccurate Ofqual's algorithm was.
https://youtu.be/wZODW080gsc
Tl;dr it was only about 60% accurate when applied to 2019 data and they published this in their own report. (page 204 if anyone wants to look).
This is absolutely insane.0 -
NI will be using the teacher-predicted grades for GCSE, with the minister currently sticking to his guns (if you'll excuse the expression) on A Levels.
The required 30 MLA signatures have been received (supported by all parties except the DUP) to petition the speaker to recall the assembly.
“New York has the haircuts, London has the trousers, but Belfast has the reason!0 -
When I were lad, some teachers included stuff not on the syllabus*, on account of it being part of the subject we were supposed to be learning about.Ben6899 said:
I found maths A level at about the right level (maths with mechanics and pure maths from memory), personally. In the first year at university (Civil Engineering at Leeds), we had to do maths again - the pure stuff - and going back over all this was... annoying. It was like being taught how to read again.rjsterry said:That being said, what is the material impact? I know my brother found that the easier maths A level course meant he found his first year of Mechanical engineering at Imperial a bit heavy going but once he caught up it was a non issue.
But it soon became clear why it was done. Others who'd left other sixth forms or colleges with A or B (entry requirement) in A level maths were simply not at the same level as some. They didn't make it to 2nd year - difficult to study English if you can't read.
So some levelling up was required then (1998) and from all the discussion these past few weeks/months, I am not sure that has happened. We could have squeezed in a further engineering module rather than going back over calculus...
That doesn't happen any more. A predictable consequence of students being used as a tool to evaluate the teachers.
*we also used to dream of living in a coridoor.0 -
Coridoor!?First.Aspect said:
When I were lad, some teachers included stuff not on the syllabus*, on account of it being part of the subject we were supposed to be learning about.Ben6899 said:
I found maths A level at about the right level (maths with mechanics and pure maths from memory), personally. In the first year at university (Civil Engineering at Leeds), we had to do maths again - the pure stuff - and going back over all this was... annoying. It was like being taught how to read again.rjsterry said:That being said, what is the material impact? I know my brother found that the easier maths A level course meant he found his first year of Mechanical engineering at Imperial a bit heavy going but once he caught up it was a non issue.
But it soon became clear why it was done. Others who'd left other sixth forms or colleges with A or B (entry requirement) in A level maths were simply not at the same level as some. They didn't make it to 2nd year - difficult to study English if you can't read.
So some levelling up was required then (1998) and from all the discussion these past few weeks/months, I am not sure that has happened. We could have squeezed in a further engineering module rather than going back over calculus...
That doesn't happen any more. A predictable consequence of having to students used as a tool to evaluate the teachers.
*we also used to dream of living in a coridoor.
LUXURY!Ben
Bikes: Donhou DSS4 Custom | Condor Italia RC | Gios Megalite | Dolan Preffisio | Giant Bowery '76
Instagram: https://www.instagram.com/ben_h_ppcc/
Flickr: https://www.flickr.com/photos/143173475@N05/
0 -
I'm surprised there was ever any expectation that the minister for education could formulate and execute a cohesive and joined-up policy that would not be front-page news for the wrong reasons.
Chris (I can't even win a rigged election) Grayling & Gavin Williamson - absolute muppets.
The one current minister who does deliver and has an eye for detail is Gove.“Give a man a fish and feed him for a day. Teach a man to fish and feed him for a lifetime. Teach a man to cycle and he will realize fishing is stupid and boring”
Desmond Tutu0 -
Gotchakingstongraham said:
That's the bit that is insane. "We know someone is going to have a bad day, so bad luck Emma, we've decided it would have been you.".johngti said:
Why? How do you predict who’s going to have a bad day? We had some we were pretty confident were going to get grade 4 (high performing grammar school so about the lowest we ever get) so we put those in. What if theirs get downgraded? We had a 3 in the year group 3 years ago so...kingstongraham said:
"Almost all the teachers we interviewed told us that they had generally predicted how the students would perform on a ‘good day’. Although they knew that every year some students underperform or have a bad day, this was not the basis of their judgements. This might be as expected, but the cumulative effect of this optimism, if reflected in the final results, would have undermined confidence in those results."TheBigBean said:
I have been encouraging people to read the report. The executive summary is not that longer, but it is more than 140 characters.rjsterry said:An interesting video explaining just how inaccurate Ofqual's algorithm was.
https://youtu.be/wZODW080gsc
Tl;dr it was only about 60% accurate when applied to 2019 data and they published this in their own report. (page 204 if anyone wants to look).
This is absolutely insane.
This year's surely should have been awarded based on having a good day, and if that means grade inflation, that's the way it is.
0 -
Would you mind reposting a link to that report?TheBigBean said:
I have been encouraging people to read the report. The executive summary is not that longer, but it is more than 140 characters.rjsterry said:An interesting video explaining just how inaccurate Ofqual's algorithm was.
https://youtu.be/wZODW080gsc
Tl;dr it was only about 60% accurate when applied to 2019 data and they published this in their own report. (page 204 if anyone wants to look).
Ta“New York has the haircuts, London has the trousers, but Belfast has the reason!0 -
CV + samples of work for selection for interview, then interview. Degree grade is one line in the CV and the samples of work tell you far more about the candidate's ability. At the interview you can then confirm that the work really is their work and that they have a good understanding of what they have produced.First.Aspect said:
How do you evaluate graduates?rjsterry said:I think it's interesting that so much weight is put on the grade. In my field, because you need to show examples of your work in any job interview scenario, there is much less reliance on the grade, because much stronger evidence of ability is right there in front of you.
1985 Mercian King of Mercia - work in progress (Hah! Who am I kidding?)
Pinnacle Monzonite
Part of the anti-growth coalition0 -
How can they show you examples if they are trying to get their first job? That's what I'm asking.rjsterry said:
CV + samples of work for selection for interview, then interview. Degree grade is one line in the CV and the samples of work tell you far more about the candidate's ability. At the interview you can then confirm that the work really is their work and that they have a good understanding of what they have produced.First.Aspect said:
How do you evaluate graduates?rjsterry said:I think it's interesting that so much weight is put on the grade. In my field, because you need to show examples of your work in any job interview scenario, there is much less reliance on the grade, because much stronger evidence of ability is right there in front of you.
0 -
Surely the predictions shouldn't have been based on a good day, or a bad day, but just a normal day for that candidate, based on what the teacher has learnt about the abilities and dedication of the pupil over the years they have been teaching them.kingstongraham said:
"Almost all the teachers we interviewed told us that they had generally predicted how the students would perform on a ‘good day’. Although they knew that every year some students underperform or have a bad day, this was not the basis of their judgements. This might be as expected, but the cumulative effect of this optimism, if reflected in the final results, would have undermined confidence in those results."TheBigBean said:
I have been encouraging people to read the report. The executive summary is not that longer, but it is more than 140 characters.rjsterry said:An interesting video explaining just how inaccurate Ofqual's algorithm was.
https://youtu.be/wZODW080gsc
Tl;dr it was only about 60% accurate when applied to 2019 data and they published this in their own report. (page 204 if anyone wants to look).
This is absolutely insane.
Isn't this assumption that every pupil was going to have a good day (rather than a normal day) part of the whole problem?
0 -
Did the teachers predict grades or percentages? If it is grades, surely no algorithm in the world will be able to normalise the results fairly?0
-
Depends how you interpret 'good day'. I was reading it as being a day when they weren't hit by hay fever, family issues, a bad night's sleep etc. Normal, in other words.Dorset_Boy said:
Surely the predictions shouldn't have been based on a good day, or a bad day, but just a normal day for that candidate, based on what the teacher has learnt about the abilities and dedication of the pupil over the years they have been teaching them.kingstongraham said:
"Almost all the teachers we interviewed told us that they had generally predicted how the students would perform on a ‘good day’. Although they knew that every year some students underperform or have a bad day, this was not the basis of their judgements. This might be as expected, but the cumulative effect of this optimism, if reflected in the final results, would have undermined confidence in those results."TheBigBean said:
I have been encouraging people to read the report. The executive summary is not that longer, but it is more than 140 characters.rjsterry said:An interesting video explaining just how inaccurate Ofqual's algorithm was.
https://youtu.be/wZODW080gsc
Tl;dr it was only about 60% accurate when applied to 2019 data and they published this in their own report. (page 204 if anyone wants to look).
This is absolutely insane.
Isn't this assumption that every pupil was going to have a good day (rather than a normal day) part of the whole problem?0 -
If I understand correctly they put pupils in order and then they were allocated grades based upon previous years' performance. This means that you got somebody else's bad day result.First.Aspect said:Did the teachers predict grades or percentages? If it is grades, surely no algorithm in the world will be able to normalise the results fairly?
This is so mad that I may have totally misunderstood.
If you had fewer than 5 kids in the class then the teacher's prediction was used; with 5-15, some weight was given to the teacher's prediction; and with 15+ you were at the mercy of the algorithm.0 -
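Out of interest, a rough sketch of that mechanism under my own simplifying assumptions - it is not Ofqual's code, the pupil names and numbers are invented, and the "some weight" blending for 5-15 pupils is reduced here to just using the teacher's grade:

GRADES = ["A*", "A", "B", "C", "D", "E", "U"]  # best to worst

def allocate_grades(ranked_pupils, cags, historical_shares, small=5, large=15):
    """ranked_pupils: teacher's rank order, best first. cags: pupil -> predicted grade.
    historical_shares: grade -> share of the school's previous cohorts."""
    n = len(ranked_pupils)
    if n < small:
        return {p: cags[p] for p in ranked_pupils}  # teacher prediction used outright
    if n <= large:
        # 5-15 pupils: the real approach gave the prediction "some weight"; simplified here.
        return {p: cags[p] for p in ranked_pupils}

    # 15+ pupils: build n grade slots from the historical shares, best grades first,
    # and deal them down the ranking regardless of what the teacher predicted.
    slots = []
    for grade in GRADES:
        slots += [grade] * round(historical_shares.get(grade, 0.0) * n)
    slots = (slots + ["U"] * n)[:n]  # absorb rounding drift
    return dict(zip(ranked_pupils, slots))

# Hypothetical class of 20: every predicted grade is a B, but the school's history
# says two pupils must get a D - in effect, somebody else's bad day.
pupils = [f"pupil_{i:02d}" for i in range(20)]
cags = {p: "B" for p in pupils}
history = {"A": 0.20, "B": 0.45, "C": 0.25, "D": 0.10}
print(allocate_grades(pupils, cags, history))

On that toy class, the two pupils ranked last get the Ds purely because the historical distribution says two Ds are due, whatever the teacher predicted.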
It does seem like that is how it has happened, however crazy it sounds:
0 -
It's actually a little worse than that - see my worked example in, I think, the corona thread.surrey_commuter said:
If I understand correctly they put pupils in order and then they were allocated grades based upon previous years performance. This means that you got somebody else's bad day result.First.Aspect said:Did the teachers predict grades or percentages? If it is grades, surely no algorithm in the world will be able to normalise the results fairly?
This is so mad that I may have totally misunderstood.
if you have less than 5 kids in the class then teachers prediction was used, 5-15 some weight given to teacher prediction and 15+ you were at the mercy of the algorithm.0