PTE Re-tell Lecture Practice from a Real Lecture: Are Algorithms Biased?

In the PTE, both the Summarise Spoken Text and Re-tell Lecture prompts are mostly excerpted from real lectures and speeches, which often contain a lot of background noise. Many students report that the problem is not that they cannot understand the audio, but that they cannot hear it clearly. For this reason, Wenbo IELTS & PTE (Melbourne/Sydney) has compiled PTE practice audio taken from real lectures. Compared with news audio, these recordings are much closer to the actual PTE exam, and we will also provide questions similar to those that appear in the real test. We will keep updating this series, so stay tuned!


It was this thing with Facebook, right? Whether there was bias in the algorithm. You know, you can set the stage, you know the case better, but there was this aspect of Facebook's News Feed: whether humans were involved in changing the order of how prominent something is. And people were upset that humans might play a role, because they all think that Facebook should just be using unbiased algorithms. But the trouble is, there is no such thing as an unbiased algorithm, so I'm not really going to comment on the details of a situation where I don't know exactly what was done. But I think it's fair to say that something that is obvious to those of us working in the field is that there is no such thing as being unbiased. From an AI perspective, there's something called the no free lunch theorem. And this no free lunch theorem says that you can't have one algorithm that is universally good at everything; every algorithm is going to have its strengths and weaknesses. I run an algorithms company now. We're trying to find algorithms that we think fit the real world in a particular way. But our algorithms are not universal; they're not like a universal solvent that can dissolve everything. They are good for some kinds of problems, but not others. Well, all algorithms are like that. And if you're talking about using an algorithm with respect to human beings, there's always going to be some kind of bias. So the first thing we have to do is have programmers, and, you know, CEOs and so forth, who understand that, who understand that there is no kind of objective truth out there. Whatever algorithm you pick, there is something in there that is going to bias it one way or another. You're trying to minimize that bias, minimize, you know, certain kinds of variance, but there is no absolute truth; there is no way of taking humans out. Humans made the algorithms in the first place; they made decisions about which data to collect and how to count it.
Once you've made these decisions about what you are counting, you know, what things you put together in a category, there's already bias.
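The "no free lunch" point in the lecture can be illustrated with a toy sketch. The two predictors and the two datasets below are invented purely for illustration: each predictor encodes a different assumption about the data, and each one wins only on the kind of data that matches its assumption.

```python
def predict_constant(history):
    """Predict the most common label seen so far (assumes mostly-steady data)."""
    return max(set(history), key=history.count)

def predict_alternating(history):
    """Predict the opposite of the last label (assumes alternating data)."""
    return 1 - history[-1]

def accuracy(predictor, sequence):
    """Score a predictor on a 0/1 sequence, predicting each item from its prefix."""
    hits = 0
    for i in range(1, len(sequence)):
        if predictor(sequence[:i]) == sequence[i]:
            hits += 1
    return hits / (len(sequence) - 1)

steady = [1, 1, 1, 0, 1, 1, 1, 1, 0, 1]        # mostly-constant toy data
alternating = [0, 1, 0, 1, 0, 1, 0, 1, 0, 1]   # perfectly alternating toy data

# Each predictor beats the other on the data that fits its built-in assumption,
# and loses on the data that doesn't.
for name, data in [("steady", steady), ("alternating", alternating)]:
    print(name,
          "constant:", round(accuracy(predict_constant, data), 2),
          "alternating:", round(accuracy(predict_alternating, data), 2))
```

The bias here is the assumption baked into each predictor before any data arrives, which is the speaker's point: choosing an algorithm means choosing an assumption, and no single choice is best everywhere.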
