Exam Grades

Comments

  • First.Aspect
    First.Aspect Posts: 14,626

    For what it is worth, I don't really care about A-Levels - I knew mine were much easier at the time. I object to the drop in degree standards as mine wasn't easy.

    The drop in degree standards is because there are more degrees. The top degrees in credible subjects are just as hard and competitive as they ever were.

    The "degrees" that are what we would have considered to be entirely different types of qualifications, but in a fancy frock, or the playschool subjects cobbled together and padded out to make them a "degree" drag the entire system down.

    You cannot have degrees taken at twice as many institutions and by three times as many people, and expect the bottom half to be of the same standard.
  • TheBigBean
    TheBigBean Posts: 20,596

    First.Aspect said:

    For what it is worth, I don't really care about A-Levels - I knew mine were much easier at the time. I object to the drop in degree standards as mine wasn't easy.

    The drop in degree standards is because there are more degrees. The top degrees in credible subjects are just as hard and competitive as they ever were.

    The "degrees" that are really what we would once have considered entirely different types of qualification, dressed up in a fancy frock, or playschool subjects cobbled together and padded out to make a "degree", drag the entire system down.

    You cannot have degrees taken at twice as many institutions and by three times as many people, and expect the bottom half to be of the same standard.
    Even the top ones hand out much higher grades.
  • TheBigBean
    TheBigBean Posts: 20,596
    edited August 2020
    E.g. UCL has gone from 24% to 40% firsts in seven years.

    https://www.bbc.co.uk/news/education-48951653
  • rjsterry
    rjsterry Posts: 27,638
    I think it's interesting that so much weight is put on the grade. In my field, because you need to show examples of your work in any job interview scenario, there is much less reliance on the grade, because much stronger evidence of ability is right there in front of you.
    1985 Mercian King of Mercia - work in progress (Hah! Who am I kidding?)
    Pinnacle Monzonite

    Part of the anti-growth coalition
  • rjsterry
    rjsterry Posts: 27,638
    An interesting video explaining just how inaccurate Ofqual's algorithm was.

    https://youtu.be/wZODW080gsc

    Tl;dr it was only about 60% accurate when applied to 2019 data and they published this in their own report. (page 204 if anyone wants to look).
    1985 Mercian King of Mercia - work in progress (Hah! Who am I kidding?)
    Pinnacle Monzonite

    Part of the anti-growth coalition
  • First.Aspect
    First.Aspect Posts: 14,626
    rjsterry said:

    I think it's interesting that so much weight is put on the grade. In my field, because you need to show examples of your work in any job interview scenario, there is much less reliance on the grade, because much stronger evidence of ability is right there in front of you.

    How do you evaluate graduates?
  • TheBigBean
    TheBigBean Posts: 20,596
    rjsterry said:

    An interesting video explaining just how inaccurate Ofqual's algorithm was.

    https://youtu.be/wZODW080gsc

    Tl;dr it was only about 60% accurate when applied to 2019 data and they published this in their own report. (page 204 if anyone wants to look).

    I have been encouraging people to read the report. The executive summary is not that long, but it is more than 140 characters.
  • rjsterry said:

    An interesting video explaining just how inaccurate Ofqual's algorithm was.

    https://youtu.be/wZODW080gsc

    Tl;dr it was only about 60% accurate when applied to 2019 data and they published this in their own report. (page 204 if anyone wants to look).

    I have been encouraging people to read the report. The executive summary is not that long, but it is more than 140 characters.
    If not, Paul Johnson has spent the weekend reading it and has a great piece in today’s Times.
  • kingstongraham
    kingstongraham Posts: 26,230

    rjsterry said:

    An interesting video explaining just how inaccurate Ofqual's algorithm was.

    https://youtu.be/wZODW080gsc

    Tl;dr it was only about 60% accurate when applied to 2019 data and they published this in their own report. (page 204 if anyone wants to look).

    I have been encouraging people to read the report. The executive summary is not that long, but it is more than 140 characters.
    "Almost all the teachers we interviewed told us that they had generally predicted how the students would perform on a ‘good day’. Although they knew that every year some students underperform or have a bad day, this was not the basis of their judgements. This might be as expected, but the cumulative effect of this optimism, if reflected in the final results, would have undermined confidence in those results."

    This is absolutely insane.
  • Ben6899
    Ben6899 Posts: 9,686
    rjsterry said:

    That being said, what is the material impact? I know my brother found that the easier maths A level course meant he found his first year of Mechanical engineering at Imperial a bit heavy going, but once he caught up it was a non-issue.

    I found maths A level at about the right level (maths with mechanics and pure maths from memory), personally. In the first year at university (Civil Engineering at Leeds), we had to do maths again - the pure stuff - and going back over all this was... annoying. It was like being taught how to read again.

    But it soon became clear why it was done. Others who'd left other sixth forms or colleges with A or B (entry requirement) in A level maths were simply not at the same level as some. They didn't make it to 2nd year - difficult to study English if you can't read.

    So some levelling up was required then (1998) and from all the discussion these past few weeks/months, I am not sure that has happened. We could have squeezed in a further engineering module rather than going back over calculus...
    Ben

    Bikes: Donhou DSS4 Custom | Condor Italia RC | Gios Megalite | Dolan Preffisio | Giant Bowery '76
    Instagram: https://www.instagram.com/ben_h_ppcc/
    Flickr: https://www.flickr.com/photos/143173475@N05/
  • rjsterry said:

    An interesting video explaining just how inaccurate Ofqual's algorithm was.

    https://youtu.be/wZODW080gsc

    Tl;dr it was only about 60% accurate when applied to 2019 data and they published this in their own report. (page 204 if anyone wants to look).

    Did they enlist Neil Ferguson for help with this modelling?
  • johngti
    johngti Posts: 2,508

    rjsterry said:

    An interesting video explaining just how inaccurate Ofqual's algorithm was.

    https://youtu.be/wZODW080gsc

    Tl;dr it was only about 60% accurate when applied to 2019 data and they published this in their own report. (page 204 if anyone wants to look).

    I have been encouraging people to read the report. The executive summary is not that long, but it is more than 140 characters.
    "Almost all the teachers we interviewed told us that they had generally predicted how the students would perform on a ‘good day’. Although they knew that every year some students underperform or have a bad day, this was not the basis of their judgements. This might be as expected, but the cumulative effect of this optimism, if reflected in the final results, would have undermined confidence in those results."

    This is absolutely insane.
    Why? How do you predict who’s going to have a bad day? We had some we were pretty confident were going to get grade 4 (high performing grammar school so about the lowest we ever get) so we put those in. What if theirs get downgraded? We had a 3 in the year group 3 years ago so...
  • kingstongraham
    kingstongraham Posts: 26,230
    johngti said:

    rjsterry said:

    An interesting video explaining just how inaccurate Ofqual's algorithm was.

    https://youtu.be/wZODW080gsc

    Tl;dr it was only about 60% accurate when applied to 2019 data and they published this in their own report. (page 204 if anyone wants to look).

    I have been encouraging people to read the report. The executive summary is not that long, but it is more than 140 characters.
    "Almost all the teachers we interviewed told us that they had generally predicted how the students would perform on a ‘good day’. Although they knew that every year some students underperform or have a bad day, this was not the basis of their judgements. This might be as expected, but the cumulative effect of this optimism, if reflected in the final results, would have undermined confidence in those results."

    This is absolutely insane.
    Why? How do you predict who’s going to have a bad day? We had some we were pretty confident were going to get grade 4 (high performing grammar school so about the lowest we ever get) so we put those in. What if theirs get downgraded? We had a 3 in the year group 3 years ago so...
    That's the bit that is insane. "We know someone is going to have a bad day, so bad luck Emma, we've decided it would have been you."

    This year's surely should have been awarded based on having a good day, and if that means grade inflation, that's the way it is.
  • rjsterry said:

    An interesting video explaining just how inaccurate Ofqual's algorithm was.

    https://youtu.be/wZODW080gsc

    Tl;dr it was only about 60% accurate when applied to 2019 data and they published this in their own report. (page 204 if anyone wants to look).

    I have been encouraging people to read the report. The executive summary is not that long, but it is more than 140 characters.
    "Almost all the teachers we interviewed told us that they had generally predicted how the students would perform on a ‘good day’. Although they knew that every year some students underperform or have a bad day, this was not the basis of their judgements. This might be as expected, but the cumulative effect of this optimism, if reflected in the final results, would have undermined confidence in those results."

    This is absolutely insane.
    Garbage in = Garbage out (GIGO) is a well known computer science and mathematics concept
  • kingstongraham
    kingstongraham Posts: 26,230

    rjsterry said:

    An interesting video explaining just how inaccurate Ofqual's algorithm was.

    https://youtu.be/wZODW080gsc

    Tl;dr it was only about 60% accurate when applied to 2019 data and they published this in their own report. (page 204 if anyone wants to look).

    I have been encouraging people to read the report. The executive summary is not that long, but it is more than 140 characters.
    "Almost all the teachers we interviewed told us that they had generally predicted how the students would perform on a ‘good day’. Although they knew that every year some students underperform or have a bad day, this was not the basis of their judgements. This might be as expected, but the cumulative effect of this optimism, if reflected in the final results, would have undermined confidence in those results."

    This is absolutely insane.
    Garbage in = Garbage out (GIGO) is a well known computer science and mathematics concept
    It is, but I don't see why Ofqual have introduced a random garbage factor.
  • tailwindhome
    tailwindhome Posts: 18,932
    NI will be using the teacher-predicted grades for GCSE; the minister is currently sticking to his guns (if you'll excuse the expression) on A Levels.

    The required 30 MLA signatures have been received (supported by all parties except the DUP) to petition the speaker to recall the assembly

    “New York has the haircuts, London has the trousers, but Belfast has the reason!”
  • First.Aspect
    First.Aspect Posts: 14,626
    Ben6899 said:

    rjsterry said:

    That being said, what is the material impact? I know my brother found that the easier maths A level course meant he found his first year of Mechanical engineering at Imperial a bit heavy going, but once he caught up it was a non-issue.

    I found maths A level at about the right level (maths with mechanics and pure maths from memory), personally. In the first year at university (Civil Engineering at Leeds), we had to do maths again - the pure stuff - and going back over all this was... annoying. It was like being taught how to read again.

    But it soon became clear why it was done. Others who'd left other sixth forms or colleges with A or B (entry requirement) in A level maths were simply not at the same level as some. They didn't make it to 2nd year - difficult to study English if you can't read.

    So some levelling up was required then (1998) and from all the discussion these past few weeks/months, I am not sure that has happened. We could have squeezed in a further engineering module rather than going back over calculus...
    When I were lad, some teachers included stuff not on the syllabus*, on account of it being part of the subject we were supposed to be learning about.

    That doesn't happen any more. A predictable consequence of students being used as a tool to evaluate the teachers.

    *we also used to dream of living in a coridoor.
  • Ben6899
    Ben6899 Posts: 9,686

    Ben6899 said:

    rjsterry said:

    That being said, what is the material impact? I know my brother found that the easier maths A level course meant he found his first year of Mechanical engineering at Imperial a bit heavy going, but once he caught up it was a non-issue.

    I found maths A level at about the right level (maths with mechanics and pure maths from memory), personally. In the first year at university (Civil Engineering at Leeds), we had to do maths again - the pure stuff - and going back over all this was... annoying. It was like being taught how to read again.

    But it soon became clear why it was done. Others who'd left other sixth forms or colleges with A or B (entry requirement) in A level maths were simply not at the same level as some. They didn't make it to 2nd year - difficult to study English if you can't read.

    So some levelling up was required then (1998) and from all the discussion these past few weeks/months, I am not sure that has happened. We could have squeezed in a further engineering module rather than going back over calculus...
    When I were lad, some teachers included stuff not on the syllabus*, on account of it being part of the subject we were supposed to be learning about.

    That doesn't happen any more. A predictable consequence of students being used as a tool to evaluate the teachers.

    *we also used to dream of living in a coridoor.
    Coridoor!?

    LUXURY!
    Ben

    Bikes: Donhou DSS4 Custom | Condor Italia RC | Gios Megalite | Dolan Preffisio | Giant Bowery '76
    Instagram: https://www.instagram.com/ben_h_ppcc/
    Flickr: https://www.flickr.com/photos/143173475@N05/
  • slowmart
    slowmart Posts: 4,480
    I'm surprised there were expectations that the minister for education could formulate and execute even a cohesive, joined-up policy that would not be front-page news for the wrong reasons.

    Chris (I can't even win a rigged election) Grayling & Gavin Williamson, absolute muppets.

    The one current minister who does deliver and has an eye for detail is Gove.
    “Give a man a fish and feed him for a day. Teach a man to fish and feed him for a lifetime. Teach a man to cycle and he will realize fishing is stupid and boring”

    Desmond Tutu
  • johngti
    johngti Posts: 2,508

    johngti said:

    rjsterry said:

    An interesting video explaining just how inaccurate Ofqual's algorithm was.

    https://youtu.be/wZODW080gsc

    Tl;dr it was only about 60% accurate when applied to 2019 data and they published this in their own report. (page 204 if anyone wants to look).

    I have been encouraging people to read the report. The executive summary is not that long, but it is more than 140 characters.
    "Almost all the teachers we interviewed told us that they had generally predicted how the students would perform on a ‘good day’. Although they knew that every year some students underperform or have a bad day, this was not the basis of their judgements. This might be as expected, but the cumulative effect of this optimism, if reflected in the final results, would have undermined confidence in those results."

    This is absolutely insane.
    Why? How do you predict who’s going to have a bad day? We had some we were pretty confident were going to get grade 4 (high performing grammar school so about the lowest we ever get) so we put those in. What if theirs get downgraded? We had a 3 in the year group 3 years ago so...
    That's the bit that is insane. "We know someone is going to have a bad day, so bad luck Emma, we've decided it would have been you."

    This year's surely should have been awarded based on having a good day, and if that means grade inflation, that's the way it is.
    Gotcha

  • tailwindhome
    tailwindhome Posts: 18,932

    rjsterry said:

    An interesting video explaining just how inaccurate Ofqual's algorithm was.

    https://youtu.be/wZODW080gsc

    Tl;dr it was only about 60% accurate when applied to 2019 data and they published this in their own report. (page 204 if anyone wants to look).

    I have been encouraging people to read the report. The executive summary is not that long, but it is more than 140 characters.
    Would you mind reposting a link to that report?
    Ta
    “New York has the haircuts, London has the trousers, but Belfast has the reason!”
  • rjsterry
    rjsterry Posts: 27,638

    rjsterry said:

    I think it's interesting that so much weight is put on the grade. In my field, because you need to show examples of your work in any job interview scenario, there is much less reliance on the grade, because much stronger evidence of ability is right there in front of you.

    How do you evaluate graduates?
    CV + samples of work for selection for interview, then interview. Degree grade is one line in the CV and the samples of work tell you far more about the candidate's ability. At the interview you can then confirm that the work really is their work and that they have a good understanding of what they have produced.
    1985 Mercian King of Mercia - work in progress (Hah! Who am I kidding?)
    Pinnacle Monzonite

    Part of the anti-growth coalition
  • First.Aspect
    First.Aspect Posts: 14,626
    rjsterry said:

    rjsterry said:

    I think it's interesting that so much weight is put on the grade. In my field, because you need to show examples of your work in any job interview scenario, there is much less reliance on the grade, because much stronger evidence of ability is right there in front of you.

    How do you evaluate graduates?
    CV + samples of work for selection for interview, then interview. Degree grade is one line in the CV and the samples of work tell you far more about the candidate's ability. At the interview you can then confirm that the work really is their work and that they have a good understanding of what they have produced.
    How can they show you examples if they are trying to get their first job? That's what I'm asking.
  • Dorset_Boy
    Dorset_Boy Posts: 6,918

    rjsterry said:

    An interesting video explaining just how inaccurate Ofqual's algorithm was.

    https://youtu.be/wZODW080gsc

    Tl;dr it was only about 60% accurate when applied to 2019 data and they published this in their own report. (page 204 if anyone wants to look).

    I have been encouraging people to read the report. The executive summary is not that long, but it is more than 140 characters.
    "Almost all the teachers we interviewed told us that they had generally predicted how the students would perform on a ‘good day’. Although they knew that every year some students underperform or have a bad day, this was not the basis of their judgements. This might be as expected, but the cumulative effect of this optimism, if reflected in the final results, would have undermined confidence in those results."

    This is absolutely insane.
    Surely the predictions shouldn't have been based on a good day, or a bad day, but just a normal day for that candidate, based on what the teacher has learnt about the abilities and dedication of the pupil over the years they have been teaching them.

    Isn't this assumption that every pupil was going to have a good day (rather than a normal day) part of the whole problem?


  • First.Aspect
    First.Aspect Posts: 14,626
    Did the teachers predict grades or percentages? If it is grades, surely no algorithm in the world will be able to normalise the results fairly?
  • kingstongraham
    kingstongraham Posts: 26,230

    rjsterry said:

    An interesting video explaining just how inaccurate Ofqual's algorithm was.

    https://youtu.be/wZODW080gsc

    Tl;dr it was only about 60% accurate when applied to 2019 data and they published this in their own report. (page 204 if anyone wants to look).

    I have been encouraging people to read the report. The executive summary is not that long, but it is more than 140 characters.
    "Almost all the teachers we interviewed told us that they had generally predicted how the students would perform on a ‘good day’. Although they knew that every year some students underperform or have a bad day, this was not the basis of their judgements. This might be as expected, but the cumulative effect of this optimism, if reflected in the final results, would have undermined confidence in those results."

    This is absolutely insane.
    Surely the predictions shouldn't have been based on a good day, or a bad day, but just a normal day for that candidate, based on what the teacher has learnt about the abilities and dedication of the pupil over the years they have been teaching them.

    Isn't this assumption that every pupil was going to have a good day (rather than a normal day) part of the whole problem?


    Depends how you interpret 'good day'. I was reading it as being a day when they weren't hit by hay fever, family issues, a bad night's sleep etc. Normal, in other words.
  • Did the teachers predict grades or percentages? If it is grades, surely no algorithm in the world will be able to normalise the results fairly?

    If I understand correctly, they put pupils in order and then they were allocated grades based upon previous years' performance. This means that you got somebody else's bad day result.

    This is so mad that I may have totally misunderstood.

    If you have fewer than 5 kids in the class then the teacher's prediction was used; with 5-15, some weight was given to the teacher's prediction; and with 15+ you were at the mercy of the algorithm.
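
    The mechanism described in this post can be sketched in a few lines of Python. This is a rough illustration only, with invented function and variable names; Ofqual's actual model (described in the report) is far more involved, and the 5-15 blending of teacher prediction with the statistics is omitted here.

    ```python
    # Rough sketch of the standardisation mechanism as described above.
    # Names are invented; the real model is far more involved, and the
    # 5-15 blending of teacher prediction with the statistics is omitted.

    def allocate_grades(rank_order, teacher_grades, historical_distribution):
        """rank_order: pupils listed best-first by the teacher.
        teacher_grades: pupil -> teacher-predicted grade.
        historical_distribution: share of each grade in the school's previous
        cohorts, best grade first, e.g. {"A": 0.2, "B": 0.3, "C": 0.5}."""
        n = len(rank_order)
        if n < 5:
            # Small classes: the teacher's prediction was used directly.
            return {pupil: teacher_grades[pupil] for pupil in rank_order}

        # Turn the historical shares into a list of grade "slots" for this
        # class size, best grades first.
        slots = []
        for grade, share in historical_distribution.items():
            slots += [grade] * round(share * n)
        slots = (slots + [slots[-1]] * n)[:n]  # absorb rounding leftovers

        # Large classes: grades come purely from rank position plus history,
        # so a pupil can inherit "somebody else's bad day" from a past year.
        return {pupil: slots[i] for i, pupil in enumerate(rank_order)}
    ```

    For a class of 20 with a historical split of 20% A, 30% B and 50% C, the top four pupils get an A and the bottom ten get a C, whatever the teacher predicted for any individual. That is the rank-plus-history effect being discussed here.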
  • kingstongraham
    kingstongraham Posts: 26,230
    It does seem like that is how it happened, however crazy it sounds.

  • rick_chasey
    rick_chasey Posts: 72,612

    Did the teachers predict grades or percentages? If it is grades, surely no algorithm in the world will be able to normalise the results fairly?

    If I understand correctly, they put pupils in order and then they were allocated grades based upon previous years' performance. This means that you got somebody else's bad day result.

    This is so mad that I may have totally misunderstood.

    If you have fewer than 5 kids in the class then the teacher's prediction was used; with 5-15, some weight was given to the teacher's prediction; and with 15+ you were at the mercy of the algorithm.
    It's actually a little worse than that - see my worked example in (I think) the corona thread.