How 2 predict grades badly...
Description
Zoey and Bia discuss some of the mistakes Ofqual made in its algorithm, how using “complicated” maths is not necessarily better, and share anecdotes about their experiences with teachers and with (un)conscious bias.
Timestamps
00:20 – Introduction
01:54 – Initial thoughts
02:42 – Mistake #1 – Their approach
04:43 – Mistake #2 – Data leakage
05:15 – Mistake #3 – Emphasis on the rank
06:57 – Mistake #4 – Ignoring outliers
08:31 – Mistake #5 – No peer review
09:16 – Mistake #6 – Too precise
11:14 – Mistake #7 – Disregarded unconscious bias
12:53 – Mistake #8 – Education system in the UK
13:30 – Ofqual considered edge cases (almost a positive thing!)
15:00 – How we might have handled this situation
17:39 – Another example of algorithmic bias – the accounting system used by the Post Office
18:53 – Challenge: “Prison Break”. This is based on the “Liar’s paradox”, attributed to Epimenides (amongst many other philosophers). For more challenges, presented in a more visual manner, check out our Instagram.
25:52 – Anecdotes of experiencing bias from teachers
Useful links:
Ofqual’s report
Bristol University's study on unconscious bias - http://www.bris.ac.uk/media-library/sites/cmpo/migrated/documents/wp221.pdf
Tom SF Haines’ post (Lecturer in Machine Learning at the University of Bath) - http://thaines.com/post/alevels2020