The content in this page is not produced by Prachatai staff. Prachatai merely provides a platform, and the opinions stated here do not necessarily reflect those of Prachatai.

Testing the Test

The Ordinary National Education Test (ONET) results this year are as disappointing as in previous years.  Almost as disappointing as the wilful ignorance that produced the tests and the sadly misinformed comments on them in the media.

Let us take the Prathom 6 English test as an example. For kids who have in all likelihood been taking multiple-choice tests since pre-kindergarten, it starts by helpfully showing them how to answer this kind of question:

‘Directions: Choose the correct answer.

‘Example

‘Item 0: Which province is in the south of Thailand?

‘1. Yala  2. Maehongson  3. Samutsongkram  4. Nakornratchasima

‘The correct answer is 1. Therefore, you must darken the circle with the number 1 in it.’

Remember, this is a test of students’ competence in English. Can you be competent in English at the level required of primary school students and still get this question wrong?

Of course you can.  I just asked a native-speaker PhD who hadn’t got a clue, mainly because a knowledge of the geography of the provinces of Thailand is not a matter of English language competence.

And is it possible to know where Yala is and still get the wrong answer?  Yes, if your English is so weak that you can’t understand the question.

(And by the way, the question writer would be advised to consult the Royal Thai General System of Transcription of the Office of the Royal Society of Thailand on the officially correct spelling of the names of provinces, because whoever wrote this question got ‘Yala’ correct but all the rest wrong.  Only 25%.  And a test-writer, too.  Tsk, tsk.)

So what is this item testing?

The student who got the correct answer would be credited with both enough language competence to understand the question and sufficient geographical knowledge of 4 of Thailand’s 76 provinces.

Lots of the questions on the ONET tests combined a knowledge of English with something else.  Take question 7, for example.

‘On October 31, children celebrate Halloween. What do they usually do?

1. They splash water on each other.

2. They send handmade cards to their fathers.

3. They give jasmine flowers to their mothers.

4. They dress like ghosts.’

This expects students to know about customs in some parts of the English-speaking world (but not the one I grew up in, so when I was at Prathom 6 age I would have failed).

Question 15 has a table of heights and weights and the choices are:

1. Megan is taller than Daniel, but lighter than Thomas.

2. Thomas is taller than Megan, but lighter than Daniel.

3. Thomas is shorter than Daniel, but heavier than Megan.

4. Megan is shorter than Thomas, but heavier than Daniel.

This involves the ability to read information from a (fairly simple) table as well as the ability to understand the English.  A student who gets the wrong answer may have excellent English but no idea how to process information from a table.

For question after question, competence in English alone is not enough to know the correct answer.  So there will be students who do know English, but not the other stuff, and they will be marked wrong.  On an English test.

So this is a test of English plus all sorts of other stuff and to pass you need to know more than just English.

Or do you?

It’s a multiple-choice test, remember.  You can in fact pass without knowing any English at all.  With the right degree of luck, you could even get 100%.  The chance of this happening is extremely remote, but not impossible.
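For the record, the odds can be put in numbers.  With 4 choices per item, the chance of a perfect score by pure guessing is (1/4) raised to the number of items.  A quick sketch (the 40-item length here is an illustrative assumption, not the actual ONET item count):

```python
# Probability of scoring 100% on a multiple-choice test by guessing alone.
# CHOICES is fixed by the test format; ITEMS = 40 is an assumed length,
# for illustration only.
CHOICES = 4
ITEMS = 40

p_perfect = (1 / CHOICES) ** ITEMS
print(f"P(100% by guessing) = {p_perfect:.1e}")  # roughly 8.3e-25
```

Remote indeed, but, as promised, not zero.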

So how can a student’s score be interpreted?  Media reports uniformly assume that 50% is the threshold between passing and failing, and as far as I can see the National Institute of Educational Testing Service (NIETS) that produces these tests is saying nothing to correct this view.

And it is complete bullshit.

First, think about what guessing does to a student’s score.  There are 4 choices per question, so every time the P6 student has no idea, there is a 25% chance of getting the right answer by guessing.  So the most likely score for the know-nothing student is 25%.  If you get less than that, you either have terrible luck or you are too stupid to know that guessing can’t lose.

So the effective minimum score is not zero but most probably 25%.  Halfway between this minimum and the maximum is not 50% but 62.5%.
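For the sceptical, that arithmetic can be checked with a short simulation (again assuming 4 choices per item; the 40-item test length and the number of simulated students are illustrative):

```python
import random

CHOICES = 4
ITEMS = 40        # assumed test length, for illustration
TRIALS = 100_000  # simulated know-nothing students

random.seed(0)
total = 0.0
for _ in range(TRIALS):
    # Every answer is a blind guess; choice 0 stands for "correct".
    correct = sum(1 for _ in range(ITEMS) if random.randrange(CHOICES) == 0)
    total += correct / ITEMS

mean_score = 100 * total / TRIALS
print(f"Average score from pure guessing: about {mean_score:.1f}%")

# With a floor of ~25%, the midpoint of the usable range is:
midpoint = (25 + 100) / 2
print(f"Halfway between floor and ceiling: {midpoint}%")  # 62.5%
```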

But why is the halfway point the pass score?  Is the competence needed to get a pass score carefully correlated to curriculum standards?  No.  It turns out that most of NIETS’ test-writers don’t even know what is on the curriculum.  They’re highly educated university teachers who may not know what is taught in schools.

And does 50% this year mean the same as last year?  Or next year?  To ensure that the difficulty of this year’s test is the same as other years requires a lot of trials with a huge item bank and a careful calibration of the difficulty of each item.  And guess what?  NIETS doesn’t bother doing that.  Each year, a bunch of ‘expert’ test-writers (who may be experts in the subject but not experts at test-writing) concoct a new set of questions and no one has the slightest idea if the level of difficulty has changed.

So when the average scores improve, this cannot be taken as proof that the competence of students has improved.  It could simply be that this year’s test was a bit easier.
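A toy model shows how this works.  Suppose the same cohort sits two tests, and on the easier one the students simply know a larger share of the items; with guessing on the rest, the average score rises without any change in competence.  (The probabilities below are invented for illustration, not taken from any real test.)

```python
CHOICES = 4

def expected_score(p_know):
    # A student answers known items correctly and guesses the rest,
    # getting 1 in 4 of the guesses right.
    return 100 * (p_know + (1 - p_know) / CHOICES)

# Same students, different tests: only the item difficulty changes.
harder = expected_score(0.30)  # students know 30% of the items
easier = expected_score(0.40)  # students know 40% of the items
print(f"Harder test average: {harder:.1f}%")  # 47.5%
print(f"Easier test average: {easier:.1f}%")  # 55.0%
```

Same kids, same English, a 7.5-point jump in the average.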

So that’s nice.  An English test that doesn’t just test English and isn’t based on what has been taught and where the results can’t be compared from year to year.

And everyone from the PM down tut-tuts and blames the kids.

No, blame the clueless test-writers.


About author:  Bangkokians with long memories may remember his irreverent column in The Nation in the 1980's. During his period of enforced silence since then, he was variously reported as participating in a 999-day meditation retreat in a hill-top monastery in Mae Hong Son (he gave up after 998 days), as the Special Rapporteur for Satire of the UN High Commission for Human Rights, and as understudy for the male lead in the long-running ‘Pussies -not the Musical' at the Neasden International Palladium (formerly Park Lane Empire).