BRISBANE, Aug 10 - Clive Liebmann schools people in how to pass English tests the immigration department requires when assessing visa applications, and believes one - the Pearson Test of English (PTE) Academic - is flawed and open to manipulation.

He says PTE - a computer-based, machine-scored test - is capable of handing inflated scores to people with low English proficiency, and low scores to those with high proficiency.

"The government talks about wanting high-level English speakers, through the visa system. That's fair enough. But what's the point in the government using a test that produces - from my students' experience - often wildly inconsistent results," the Melbourne teacher said.

Liebmann made the comments after highly educated test takers said they'd essentially flunked the speaking component of PTE - the only government-approved test that uses voice recognition technology.

Candidates' audio recordings are marked by Pearson's "scoring engine", trained to identify acceptable and unacceptable answers. Other test providers use human assessors to test speaking ability.

"In my experience, a student could include some confidently-spoken gibberish in their response, mixed with some topic-specific vocabulary. Even if not used logically, if their fluency, intonation, stress, and speed is good, they could still get a high score," he said.

"On the other hand many students who go in and speak very well, in a natural, real-world way, could end up getting a low score because they're not giving the computer what it's good at."

This week Irish woman Louise Kennedy - who has spent the past two years working as a vet in Queensland and holds degrees in veterinary medicine and in politics and history - revealed she'd failed to achieve a PTE speaking score high enough to renew her working visa. Other similar cases have surfaced.

"If people are touched by the story of a young Irish vet, then consider the plight of a non-native speaker who has worked diligently on their English their whole life, only to be rejected due to the inaccuracy of the exam," Liebmann said.

He said many of his students had taken both PTE and the other predominant test, IELTS, which uses human assessors, and had received inconsistent scores between the two.

While PTE technology was good at assessing some things, such as pitch, volume, and stress, it struggled to deal with complex grammar and arguments, abstract thought, and even natural pauses in speaking patterns, he said.
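
Those strengths and weaknesses follow from what software can actually extract from a recording. As a generic illustration - using the open-source librosa library, not anything of Pearson's - pitch and loudness contours can be computed directly from the waveform, while there is no comparable signal in the audio for grammar or abstract argument:

    # Generic signal-processing sketch, NOT Pearson's code. It shows why
    # automated scoring is strong on "pitch, volume, and stress": those are
    # measurable straight from the waveform.
    import librosa
    import numpy as np

    y, sr = librosa.load("response.wav", sr=None)  # hypothetical recording

    # Loudness proxy: root-mean-square energy per frame.
    rms = librosa.feature.rms(y=y)[0]

    # Pitch contour: fundamental frequency estimated with the YIN algorithm,
    # bounded to a typical speech range (~65-400 Hz).
    f0 = librosa.yin(y, fmin=65, fmax=400, sr=sr)

    print(f"mean loudness: {rms.mean():.4f}, variation: {rms.std():.4f}")
    print(f"mean pitch: {np.nanmean(f0):.1f} Hz, variation: {np.nanstd(f0):.1f} Hz")

    # Statistics like these quantify intonation and stress directly. Nothing
    # in the signal itself encodes whether an argument is logical - that
    # requires language understanding, where Liebmann says the test falls short.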

"You need to give the computer what makes it happy. There is an entire niche industry that's sprung up, focusing on tips and tricks - those are the buzz words - to fool the computer,"  Liebmann said.

Pearson has defended its product, saying even native English speakers shouldn't assume they will get a perfect score. It argues computer-based assessment is more trustworthy because it removes factors that can sway human assessors, such as cultural and linguistic bias.