Mr Wallace said firms such as Facebook, Google and YouTube were too slow to remove radical content online, forcing the government to act instead.
While tech firms were “ruthless profiteers”, governments were spending millions policing the web, he added.
Facebook said Mr Wallace was wrong to say it put profits before safety.
YouTube said violent extremism was a “complex problem” and addressing it was a “critical challenge for us all”.
In an interview with the Sunday Times, Mr Wallace said tech giants were failing to help prevent the radicalisation of people online.
“Because content is not taken down as quickly as they could do,” he claimed, “we’re having to de-radicalise people who have been radicalised. That’s costing millions.”
He said the refusal of messaging services – such as WhatsApp, which is owned by Facebook – to give the security services access to message data was “turning the internet into an anarchic violent space”.
“Because of encryption and because of radicalisation, the cost of that is heaped on law enforcement agencies,” Mr Wallace told the newspaper.
By Emma Vardy, BBC political correspondent
This latest tax idea is a sign ministers are trying to explore all sorts of new ways to put pressure on tech firms.
How strong a tool it might be is unclear.
Tech giants’ revenues are so high that a tax, effectively a fine, may not immediately change the measures they are already taking, although it could play well with the public.
Google and Facebook have already stepped up their efforts to identify and take down extremist content, including using artificial intelligence to spot illegal content. But ministers want them to go further.
Meanwhile, smaller sites such as Telegram and WordPress present a more difficult problem.
They have done far less to tackle the use of their platforms by extremists.
Firms based overseas are almost completely out of reach of any measures imposed by the UK government, and in reality, when authorities try to engage with them, they do not get very far.
Mr Wallace said “the time for excuses is at an end” and the government should look at “all options” of incentivising firms – “including tax”.
“We should stop pretending that because they sit on beanbags in T-shirts they are not ruthless profiteers,” he said.
“They will ruthlessly sell our details to loans and soft-porn companies but not give it to our democratically elected government.”
‘Further and faster’
Simon Milner, policy director at Facebook, said Mr Wallace was wrong to say the company put profit before safety, especially in the fight against terrorism.
He said millions of pounds had been invested in people and technology to identify and remove terrorist content.
Home Secretary Amber Rudd and her European counterparts had welcomed coordinated efforts that were having a significant impact, he added.
“But this is an ongoing battle and we must continue to fight it together, indeed our CEO recently told our investors that in 2018 we will continue to put the safety of our community before profits.”
Facebook recently announced that 99% of content relating to so-called Islamic State or al-Qaeda was taken down before users had flagged it up.
‘No easy fix’
A YouTube spokesperson told the BBC it was committed to being “part of the solution” and was doing more every day to tackle the problem.
This year it had invested in machine-learning technology, recruited more content reviewers, built partnerships with experts and collaborated with other companies, they added.
Google has yet to respond to Mr Wallace’s remarks.
However, speaking in September, Kent Walker, general counsel for Google, said tech firms would not be able to “do it alone”.
“We need people and we need feedback from trusted government sources and from our users to identify and remove some of the most problematic content out there.”
Dr Shiraz Maher, from the International Centre for the Study of Radicalisation at King's College London, said: “Mobilisation to terrorism still involves real life, real world contacts so governments can’t solely blame tech companies.
“Also there is no easy fix, because what constitutes extremist material is subjective.”
Automated take-down systems have in the past led to lots of “useful” content being removed by mistake, he added.
“Syrian activists documenting war crimes by the regime and Islamic State have had their videos pulled from YouTube. Those videos will be needed in the future to bring criminals to account in The Hague.”