Wednesday, December 29, 2010

Attack hits Anonymous activists

The notorious message board 4Chan has been taken offline by an overwhelming web attack.

As a result of the attack, the site's discussion boards have been hard to reach or offline for almost 24 hours.

The attack might be retaliation for similar attacks that some 4Chan members, as part of the Anonymous group, mounted in support of Wikileaks.

It is not yet clear who is carrying out the attacks and no-one has come forward to claim responsibility.

News about the large-scale web attack, known as a Distributed Denial of Service (DDoS) attack, came to light via a message posted on Twitter by Moot, the founder of 4Chan.

He wrote: "Site is down due to DDoS. We now join the ranks of Mastercard, Visa, Paypal, et al - an exclusive club!"

A DDoS attack involves bombarding a server behind a website with data in an attempt to knock it offline.
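
The effect can be illustrated with simple capacity arithmetic (a hypothetical model, not any attacker's actual tooling): once the combined request rate exceeds what a server can answer, legitimate requests are crowded out in proportion.

```python
def served_fraction(capacity, legit_rate, attack_rate):
    """Fraction of legitimate requests a server can still answer when
    `capacity` requests/sec are shared among all arrivals.
    All figures are hypothetical."""
    total = legit_rate + attack_rate
    if total <= capacity:
        return 1.0
    return capacity / total

# A server sized for 1,000 requests/sec copes with its normal load...
print(served_fraction(1000, 800, 0))                  # 1.0
# ...but a flood of 50,000 junk requests/sec crowds real users out.
print(round(served_fraction(1000, 800, 50_000), 3))   # 0.02
```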

Many members of 4Chan work together in the guise of a group called Anonymous to carry out attacks on websites they deem to be enemies of freedom of speech.

Most recently, Anonymous members took action in support of whistle-blowing site Wikileaks. Anonymous used a DDoS tool to hit the corporate websites of Mastercard, Visa and Paypal because the firms had cut off payment connections to Wikileaks.

Paul Mutton, a security analyst at site-monitoring firm Netcraft, said the attack on 4Chan was ongoing.

"For most of the past 24 hours, the site has either been very slow to respond or has been completely unreachable," he said.

Statistics gathered by Netcraft show 4Chan was hit hard early on Wednesday but that it recovered towards the afternoon.

A blog showing the status of the various elements of 4Chan suggests the image boards, the most heavily used part of the site, have been down for hours.

Early reports suggested that a hacktivist known as the Jester was behind the attack. Some members of Anonymous had previously said they would target the Jester with DDoS attacks after he declared an ambition to knock Wikileaks offline.

However, in a Twitter message, the Jester denied any involvement in the attack.



Online Business Consulting | Internet Business Consulting

Tuesday, December 28, 2010

Scientists aim to simulate Earth

It could be one of the most ambitious computer projects ever conceived.

An international group of scientists is aiming to create a simulator that can replicate everything happening on Earth - from global weather patterns and the spread of diseases to international financial transactions and congestion on Milton Keynes' roads.

Nicknamed the Living Earth Simulator (LES), the project aims to advance the scientific understanding of what is taking place on the planet, encapsulating the human actions that shape societies and the environmental forces that define the physical world.

"Many problems we have today - including social and economic instabilities, wars, disease spreading - are related to human behaviour, but there is apparently a serious lack of understanding regarding how society and the economy work," says Dr Helbing, of the Swiss Federal Institute of Technology, who chairs the FuturICT project which aims to create the simulator.

Knowledge collider

Thanks to projects such as the Large Hadron Collider, the particle accelerator built by Cern, scientists know more about the early universe than they do about our own planet, claims Dr Helbing.

What is needed is a knowledge accelerator, to collide different branches of knowledge, he says.

"Revealing the hidden laws and processes underlying societies constitutes the most pressing scientific grand challenge of our century."

The result would be the LES. It would be able to predict the spread of infectious diseases, such as Swine Flu, identify methods for tackling climate change or even spot the inklings of an impending financial crisis, he says.

But how would such a colossal system work?

For a start it would need to be populated by data - lots of it - covering the entire gamut of activity on the planet, says Dr Helbing.

It would also be powered by an assembly of yet-to-be-built supercomputers capable of carrying out number-crunching on a mammoth scale.

Although the hardware has not yet been built, much of the data is already being generated, he says.

For example, the Planetary Skin project, led by US space agency Nasa, will see the creation of a vast sensor network collecting climate data from air, land, sea and space.

In addition, Dr Helbing and his team have already identified more than 70 online data sources they believe can be used including Wikipedia, Google Maps and the UK government's data repository Data.gov.uk.

Drowning in data

Integrating such real-time data feeds with millions of other sources of data - from financial markets and medical records to social media - would ultimately power the simulator, says Dr Helbing.

The next step is to create a framework to turn that morass of data into models that accurately replicate what is taking place on Earth today.


That will only be possible by bringing together social scientists and computer scientists and engineers to establish the rules that will define how the LES operates.

Such work cannot be left to traditional social science researchers, where typically years of work produces limited volumes of data, argues Dr Helbing.

Nor is it something that could have been achieved before - the technology needed to run the LES will only become available in the coming decade, he adds.

Human behaviour

For example, while the LES will need to be able to assimilate vast oceans of data it will simultaneously have to understand what that data means.

That becomes possible as so-called semantic web technologies mature, says Dr Helbing.

Today, a database chock-full of air pollution data would look much the same to a computer as a database of global banking transactions - essentially just a lot of numbers.

But semantic web technology will encode a description of data alongside the data itself, enabling computers to understand the data in context.
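
A minimal sketch of that idea in Python (the vocabulary terms here are invented for illustration, not drawn from any real semantic-web schema): the same list of numbers becomes machine-interpretable once a description travels with it.

```python
# Two datasets that look identical to a naive program: just numbers.
readings = [412, 415, 409]

# Attaching a description alongside the values - the core idea behind
# semantic annotation - lets software interpret them in context.
air_quality = {
    "type": "AirPollutionObservation",   # invented vocabulary term
    "property": "CO2 concentration",
    "unit": "ppm",
    "values": readings,
}
payments = {
    "type": "BankTransfer",              # invented vocabulary term
    "property": "amount",
    "unit": "EUR",
    "values": readings,
}

def describe(dataset):
    """A program can now act on meaning, not just magnitude."""
    return f"{dataset['property']} ({dataset['unit']}): {dataset['values']}"

print(describe(air_quality))  # CO2 concentration (ppm): [412, 415, 409]
print(describe(payments))     # amount (EUR): [412, 415, 409]
```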

What's more, the project's approach to aggregating data stresses the need to strip out any information that relates directly to an individual, says Dr Helbing.

That will enable the LES to incorporate vast amounts of data relating to human activity, without compromising people's privacy, he argues.

Once an approach to carrying out large-scale social and economic data analysis is agreed upon, it will be necessary to build the supercomputer centres needed to crunch that data and produce the simulation of the Earth, says Dr Helbing.

Generating the computational power to deal with the amount of data needed to populate the LES represents a significant challenge, but it's far from being a showstopper.

If you look at the data-processing capacity of Google, it's clear that the LES won't be held back by processing capacity, says Pete Warden, founder of the OpenHeatMap project and a specialist on data analysis.

While Google is somewhat secretive about the amount of data it can process, in May 2010 it was believed to use in the region of 39,000 servers to process an exabyte of data per month - that's enough data to fill 2 billion CDs every month.
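
A back-of-the-envelope check on that comparison (the per-disc capacity is an assumption: roughly 500 MB per CD reproduces the article's figure, while a full 700 MB disc gives closer to 1.4 billion):

```python
EXABYTE = 10**18  # bytes, decimal definition

def cds_needed(data_bytes, cd_bytes):
    """How many CDs a given volume of data would fill."""
    return data_bytes / cd_bytes

# Roughly 500 MB per disc matches the article's "2 billion CDs"...
print(cds_needed(EXABYTE, 500 * 10**6) / 1e9)             # 2.0 (billion)
# ...while a standard 700 MB disc gives nearer 1.4 billion.
print(round(cds_needed(EXABYTE, 700 * 10**6) / 1e9, 2))   # 1.43
```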

Reality mining

If you accept that only a fraction of the "several hundred exabytes of data being produced worldwide every year" would be useful for a world simulation, the bottleneck won't be the processing capacity, says Mr Warden.

"Getting access to the data will be much more of a challenge, as will figuring out something useful to do with it," he adds.

Simply having lots of data isn't enough to build a credible simulation of the planet, argues Warden. "Economics and sociology have consistently failed to produce theories with strong predictive powers over the last century, despite lots of data gathering. I'm sceptical that larger data sets will mark a big change," he says.

"It's not that we don't know enough about a lot of the problems the world faces, from climate change to extreme poverty, it's that we don't take any action on the information we do have," he argues.

Regardless of the challenges the project faces, the greater danger is not attempting to use the computer tools we have now - and will have in future - to improve our understanding of global socio-economic trends, says Dr Helbing.

"Over the past years, it has for example become obvious that we need better indicators than the gross national product to judge societal development and well-being," he argues.

At its heart, the LES is about working towards better methods to measure the state of society, he says, which would account for health, education and environmental issues. "And last but not least, happiness."




1m children 'without computers'

More than a million school children in the UK still lack access to a computer at home, research suggests.

And almost 2m are unable to go online at home, according to the E-Learning Foundation, a leading digital education charity.

It also claims those from the poorest families are two-and-a-half times less likely to have the internet at home than children from the richest homes.

The government would not comment on the findings.

The E-Learning Foundation, which works to ensure that all children have access to the internet and a computer at home, has analysed the latest government spending survey.

It found that while computer access is growing in better-off households, those from low-income families are being left behind.

'Get worse'

It is warning that many of the UK's poorest children face being severely educationally disadvantaged by their lack of access to technology as a result.

In November more than half of teachers who took part in a survey for the Times Educational Supplement said pupils without access to the internet or a computer at home were hampered in their learning.

The foundation's chief executive, Valerie Thompson, said: "With so many children swamped with gifts from family and friends over the Christmas period it is important we reflect on the fact that millions of children live in poverty in this country.

"For those at school, this translates into very tangible disadvantages when it comes to completing homework, researching topics, independent learning, and communicating with teachers and classmates on the school learning platform.

"Without the use of a computer and the ability to go online at home the attainment gap that characterises children from low income families is simply going to get worse."

The Department for Education was not prepared to comment on the findings.




Sunday, December 26, 2010

Net satellite ready for lift-off

Europe is about to get a second satellite dedicated to delivering broadband internet connections.

The six-tonne Ka-Sat will be launched atop a Proton rocket from Baikonur in Kazakhstan in a flight expected to last nine hours and 12 minutes.

The Eutelsat-operated spacecraft will concentrate its services on customers in the so-called "not-spots" of Europe.

It is estimated that tens of millions of households in these areas cannot get a decent terrestrial connection.

Ka-Sat will provide homes with speeds generally up to 10Mbps.

Lift-off from Baikonur is timed for 0351 local time on Monday (2151 GMT on Sunday).

The spacecraft follows the Hylas-1 platform into orbit. This satellite, operated by Avanti Communications of London, was launched just last month.

Ka-Sat, however, is considerably bigger, and has a notional capacity to serve up to two million households compared with Hylas's 300,000.

Nevertheless, such is the scale of the under-served market in Europe that both platforms should be very profitable ventures, the two companies believe.

"As many as 30 million households in Europe are not served at all or get high mediocrity of service," said Eutelsat CEO Michel de Rosen.

"These could be people in the countryside or in the mountains, sometimes not very far from large cities. Ka-Sat is an answer to that problem," he told BBC News.

Paris-based Eutelsat is one of the world's big three Fixed Satellite Services (FSS) companies, and transmits thousands of TV channels across its fleet of spacecraft.

It already provides some internet capability on its existing platforms, but Ka-Sat is its first broadband-dedicated endeavour.

High throughput

Ka-Sat will be positioned about 36,000km above the equator at nine degrees east.

Its communications payload, structure and propulsion system were prepared by EADS Astrium at its UK facilities in Stevenage and Portsmouth.

Final testing of the spacecraft took place at Astrium's factory in Toulouse, France, before shipment to Baikonur.

Ka-Sat has a total throughput of some 70Gbps.

This will be channelled via 82 spot beams on to different market areas stretching from North Africa to southern Scandinavia. A very small segment of the Middle East will also be reached.
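
A rough back-of-envelope on those figures (the inputs come from the article; the contention calculation itself is an illustrative simplification, since real capacity provisioning is more complex):

```python
TOTAL_THROUGHPUT = 70e9      # 70 Gbps total, per the article
SPOT_BEAMS = 82
HOUSEHOLDS = 2_000_000       # notional capacity
HEADLINE_SPEED = 10e6        # "up to 10Mbps" per home

per_beam_gbps = TOTAL_THROUGHPUT / SPOT_BEAMS / 1e9
print(round(per_beam_gbps, 2))   # ~0.85 Gbps per spot beam

# If every household demanded its headline speed at once, demand would
# exceed supply by this ratio - which is why the speed is "up to".
contention = HOUSEHOLDS * HEADLINE_SPEED / TOTAL_THROUGHPUT
print(round(contention))         # ~286
```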

Eutelsat has signed about 70 deals with distributors across the satellite's "footprint", and more would be signed over the next year, said Mr de Rosen.

"It takes normally a few weeks for a satellite to become operational after launch," he explained.

"In this case, it is more likely to be a few months. Expect Ka-Sat to be operational in the second half of the second quarter of 2011."

Previous failure

Ka-Sat's Proton rocket will be under the spotlight for this launch.

The Russian vehicle failed on its last outing four weeks ago, dumping three Glonass satellite-navigation spacecraft in the Pacific Ocean.

An inquiry found the Proton's new Block DM-03 upper-stage had been over-fuelled, making it too heavy to achieve its required performance.

International Launch Services (ILS), which runs the commercial operations of the Proton vehicle, will be using a different upper-stage for the Ka-Sat mission.

This Breeze M stage has a good recent record.

It will be the eighth and last ILS-organised Proton mission of 2010.



Powered by WizardRSS | Work At Home Jobs

Friday, December 24, 2010

New solar fuel machine unveiled

A prototype solar device has been unveiled which mimics plant life, turning the Sun's energy into fuel.

The machine uses the Sun's rays and a metal oxide called ceria to break down carbon dioxide or water into fuels which can be stored and transported.

Conventional photovoltaic panels must use the electricity they generate in situ, and cannot deliver power at night.

Details are published in the journal Science.

The prototype, which was devised by researchers in the US and Switzerland, uses a quartz window and cavity to concentrate sunlight into a cylinder lined with cerium oxide, also known as ceria.

Ceria has a natural propensity to exhale oxygen as it heats up and inhale it as it cools down.

If, as in the prototype, carbon dioxide and/or water are pumped into the vessel, the ceria will rapidly strip the oxygen from them as it cools, creating hydrogen and/or carbon monoxide.

The hydrogen produced could be used to power hydrogen fuel cells in cars, for example, while a combination of hydrogen and carbon monoxide can be used to create "syngas" for fuel.
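
The "exhale/inhale" behaviour the article describes is the standard two-step thermochemical ceria cycle, which can be sketched as follows (δ denotes the degree of oxygen loss; the paper's exact stoichiometry may differ):

```latex
% Step 1: concentrated sunlight heats the ceria, which releases oxygen
\mathrm{CeO_2} \xrightarrow{\;\text{heat}\;} \mathrm{CeO_{2-\delta}} + \tfrac{\delta}{2}\,\mathrm{O_2}

% Step 2: as it cools, the oxygen-hungry ceria splits the feed gases
\mathrm{CeO_{2-\delta}} + \delta\,\mathrm{H_2O} \rightarrow \mathrm{CeO_2} + \delta\,\mathrm{H_2}
\mathrm{CeO_{2-\delta}} + \delta\,\mathrm{CO_2} \rightarrow \mathrm{CeO_2} + \delta\,\mathrm{CO}
```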

It is this harnessing of ceria's properties in the solar reactor which represents the major breakthrough, say the inventors of the device. They also say the metal is readily available, being the most abundant of the "rare-earth" metals.

Methane can be produced using the same machine, they say.

Refinements needed

The prototype is grossly inefficient, the fuel created harnessing only between 0.7% and 0.8% of the solar energy taken into the vessel.

Most of the energy is lost through the reactor's walls or through the re-radiation of sunlight back through the device's aperture.

But the researchers are confident that efficiency rates of up to 19% can be achieved through better insulation and smaller apertures. Such efficiency rates, they say, could make for a viable commercial device.

"The chemistry of the material is really well suited to this process," says Professor Sossina Haile of the California Institute of Technology (Caltech). "This is the first demonstration of doing the full shebang, running it under (light) photons in a reactor."

She says the reactor could be used to create transportation fuels or be adopted in large-scale energy plants, where solar-sourced power could be available throughout the day and night.

However, she admits the fate of this and other devices in development is tied to whether states adopt a low-carbon policy.

"It's very much tied to policy. If we had a carbon policy, something like this would move forward a lot more quickly," she told the BBC.

It has been suggested that the device mimics plants, which also use carbon dioxide, water and sunlight to create energy as part of the process of photosynthesis. But Professor Haile thinks the analogy is over-simplistic.

"Yes, the reactor takes in sunlight, we take in carbon dioxide and water and we produce a chemical compound, so in the most generic sense there are these similarities, but I think that's pretty much where the analogy ends."

Daniel Davies, chief technology officer at the British photovoltaic company Solar Century, said the research was "very exciting".

"I guess the question is where you locate it - would you put your solar collector on a roof or would it be better off as a big industrial concern in the Sahara and then shipping the liquid fuel?" he said.

Solar technology is moving forward apace but the overriding challenges remain ones of efficiency, economy and storage.

New-generation "solar tower" plants have been built in Spain and the United States which use an array of mirrors to concentrate sunlight onto tower-mounted receivers which drive steam turbines.

A new Spanish project will use molten salts to store heat from the Sun for up to 15 hours, so that the plant could potentially operate through the night.




Thursday, December 23, 2010

Microsoft warns on IE browser bug

Microsoft has issued a warning about a serious vulnerability in all versions of its Internet Explorer (IE) browser.

If exploited by a booby-trapped webpage, the bug would allow attackers to take control of an unprotected computer.

Code to exploit the bug has already been published though Microsoft said it had no evidence it was currently being used by hi-tech criminals.

A workaround for the bug has been produced while Microsoft works on a permanent fix.

Code injection

The bug revolves around the way that IE manages a computer's memory when processing Cascading Style Sheets - a widely used technology that defines the look and feel of pages on a website.

Hi-tech criminals have long known that they can exploit IE's memory management to inject their own malicious code into the stream of instructions a computer processes as a browser is being used. In this way the criminals can get their own code running and hijack a PC.

Microsoft has produced updates that improve memory management, but security researchers discovered that these protection systems are not used when some older parts of Windows are called upon.

In a statement Microsoft said it was "investigating" the bug and working on a permanent fix. In the meantime it recommended those concerned use a protection system known as the Enhanced Mitigation Experience Toolkit.

Installing and applying the toolkit may require Windows XP users to update the version of the operating system they are using. But even if they do, some of the protection the toolkit bestows on Windows 7 and Vista users will not be available to them.

"We're currently unaware of any attacks trying to use the claimed vulnerability or of customer impact," said Dave Forstrom, the director of Microsoft's Trustworthy Computing group, in a statement.

"As vulnerabilities go, this kind is the most serious as it allows remote execution of code," said Rik Ferguson, senior security analyst at Trend Micro, "This means the attacker can run programs, such as malware, directly on the victim's computer."

He added: "It is highly reminiscent of a vulnerability at the same time two years ago which prompted several national governments to warn against using IE and to switch to an alternative browser."




Wednesday, December 22, 2010

Skype apology for global blackout

Millions of people around the globe have been hit by an outage at the popular internet phone service Skype.

Users as far afield as Japan, Europe and the US have all reported problems.

The company, which prides itself on providing a relatively reliable service, last suffered a major outage in 2007.

"We take outages like this really seriously and apologise for the inconvenience users are having," Tony Bates, Skype's chief executive officer, told BBC News.

"Right now it looks like clients are coming on and offline and sometimes they are crashing in the middle of calls. We are deep in the middle of investigating the cause of the problem and have teams working hard to remedy the situation," Mr Bates said.

On Skype's Twitter account, the company said their "engineers and site operations team are working non-stop to get things back to normal".

The news blog ReadWriteWeb said it had monitored complaints from users who reported that they were unable to log into the service and that the program was crashing across all platforms, whether on mobile devices or PCs.

Mr Bates did not rule in or rule out the possibility of a malicious attack and said "all avenues" were being explored.

He estimated that as a result of the outage, Skype has lost around 10 million calls.

Mr Bates told the BBC that the normal call volume for that time of day would be 20 million.

Om Malik, an industry commentator and editor of the Gigaom.com website, is not impressed.

"Skype is one of the key applications of the modern web," he said.

"It is already a hit with consumers, and over the past few years it has become part of the economic fabric for startups and small businesses around the world. I am not sure we can comprehend the productivity cost of this outage.

"The outage comes at a time when Skype is starting to ask larger corporations for their business. If I am a big business, I would be extremely cautious about adopting Skype for business, especially in light of this current outage," added Mr Malik.




Web attacks plague rights sites

Human rights groups and campaigners are being hit hard by huge web attacks launched by those opposed to their views, research finds.

Many web-based campaigning groups are being knocked offline for weeks by the attacks, it found.

The researchers expect the tempo of attacks to increase as the tools and techniques become more widespread.

The report urged human rights groups and independent media organisations to beef up their defences to avoid falling victim.

Flash flood

The research by the Berkman Center for Internet and Society at Harvard University tried to get a sense of how often human rights groups and independent media organisations are hit by what is known as Distributed Denial of Service (DDoS) attacks.

DDoS attacks try to knock a site offline by overwhelming it with data.

In the 12 months between August 2009 and September 2010 the research found evidence of 140 attacks against more than 280 different sites. The report acknowledged that these were likely to be the most high profile attacks and that many more had probably gone unreported.

"These attacks do seem to be increasingly common," said Ethan Zuckerman, one of the authors of the report.

While some attacks were triggered by specific incidents such as elections others had no obvious cause, he said.

The report cites a sustained DDoS attack on Novaya Gazeta, the website of Russia's most liberal independent newspaper.

Deputy executive editor Sergey Sokolov isn't sure who attacked his website but suspects government-sponsored Kremlin Youth organisations.

The report finds that DDoS is increasingly being used as a political tool.

Attacks that recruit participants in so-called volunteer DDoS are proving popular.

The report gives the example of the organisation 'Help Israel Win', which recently invited individuals to install a software package, dubbed Patriot DDoS, on their computers so the machines could be used to launch attacks on what the authors assume would be Palestinian targets.

The most recent example of a volunteer DDoS comes from Anonymous, a loose-knit group of activists, who used the method to launch attacks on the websites of firms it perceived to be anti-Wikileaks.

DDoS attacks could hit small media groups and campaigners hard because the organisations have such limited resources, said Mr Zuckerman.

"If you are a human rights organisation or independent media organisation you might be using an account you are paying £20 a month for, and it's very hard at that level of hosting to fend off DDoS," he told the BBC.

The attacks did not have to be prolonged, he said, to cause real problems for small campaigning groups.


"They just have to do it long enough to annoy their ISP and they will kick them off and then they have to find another place to host," said Mr Zuckerman.

Easy tools

The work of some groups only appears on the web, said Mr Zuckerman, so knocking them offline effectively silences the campaigners. It can take a long time for some to find a new host, upload content and re-build a site.

He said: "We see sites that do not come back online for two to three weeks."

The report also found that DDoS attacks are often only the most visible element of a much broader attack against a site or group.

"There's a very good chance that if you are experiencing DDoS you are being filtered, sent targeted e-mail to get access to your system or to snatch your passwords," he said.

Mr Zuckerman said some DDoS attacks logged in the report used hundreds or thousands of PCs in a botnet - networks of hijacked home computers - but others had just as big an effect with far fewer resources.

"There are certain attacks that seem to work if you have only one or two machines," he said.

What might cause problems in the future, he suggested, would be easy-to-use tools like those employed by Anonymous activists in support of Wikileaks.

"It seems like DDoS has become easier for more people to engage in," he said. "The threats do seem to be increasing."

In response, he said, rights groups needed to work hard to understand the threats and prepare in case they were hit.

"This community needs to get much, much smarter and much more knowledgeable," he said.




Tuesday, December 21, 2010

US backs net traffic regulations

US regulators have approved new rules meant to prohibit broadband companies from interfering with internet traffic.

The Federal Communications Commission (FCC) voted 3-2 on the principle known as net neutrality, a tenet that ensures all web traffic is treated equally.

The rules have been criticised for setting different standards for fixed line broadband and mobile operators.

Officials said the regulations are "the first time the commission has adopted enforceable rules" to govern the web.

The FCC's three Democrats voted to pass the regulations, while the agency's two Republicans opposed them, arguing that they were unnecessary.

Tuesday's vote is the culmination of five years of fighting over how best to ensure the free flow of information in all its forms over the internet.

The FCC vote also comes at a time when consumers are increasingly accessing the web via smart phones and turning to the internet to watch TV shows.

'Rules of the road'

The commission's ability to regulate the internet was thrown into doubt following an appeals court decision earlier this year that said the agency lacked the authority to stop cable firm Comcast from blocking bandwidth-hogging applications.

The FCC said the vote addressed "basic rules of the road to preserve the open internet as a platform for innovation, investment, competition and free expression".

That is a view backed by chairman Julius Genachowski.

"We're adopting a framework that will increase certainty for businesses, investors and entrepreneurs," Mr Genachowski said in remarks prior to the vote.

"We're taking an approach that will help foster a cycle of massive investment, innovation and consumer demand both at the edge and in the core of our broadband networks."

Michael Copps, a Democrat, said in a written statement ahead of the vote that rules represented "an important milestone in the ongoing struggle to safeguard the awesome opportunity-creating power of the open internet".

The regulations are expected to be challenged in court.

'Squandered'

A number of interested parties including internet providers, developers and companies like Google have said the rules could provide some regulatory certainty going forward. Many have acknowledged that the regulations could have been much worse.


The new rules prohibit telecommunications companies that provide high-speed internet service from blocking access by customers to any legal content, applications or service.

But, for the first time, there is now a policy that will allow for what has been termed "paid-prioritisation", where companies will be able to pay for a faster service.

The FCC regulations place tougher restrictions on wired services from cable and phone companies than on wireless carriers, which have more limited bandwidth.

The vote comes amid an increase in the number of smartphones and tablet devices being used to access the web and watch TV shows.

The rules allow mobile firms to block access to sites or applications that specifically compete with a carrier's voice or video services.

Supporters of net neutrality feel the new regulations should have gone further and have slammed them as "fake net neutrality".

"I think today is a tremendously important day in the fight to preserve a free and open internet," Aparna Sridhar of advocacy group the Free Press, told BBC News.

"Chairman Genachowski has completely squandered a golden opportunity to make this vote meaningful. Until now we have had a certain amount of regulatory uncertainty, and the carriers have had an incentive to stay on their best behaviour."

Ms Sridhar added that the rules endorse "bad practices in the wireless space".

In an opinion piece for the Huffington Post, Al Franken, US Senator for Minnesota, earlier called the FCC vote "the most important free speech issue of our time".




Patent spat threatens photo sites

The fallout from a patent dispute between Kodak and web photo site Shutterfly could embroil many online image sites, say patent experts.

Kodak claimed it owns patents regarding the display of online images that are being infringed by Shutterfly.

The photo-sharing site disputes these claims and has launched a counter suit.

But the landmark case could have ramifications for other popular online photo sites such as Yahoo's Flickr and Google's Picasa.

The past two years have seen a number of cases launched that claim online photo sites have breached patents.

But this is the first time such a large, established technology company has sought to assert its rights over online images, said Deborah Bould, a specialist in intellectual property at law firm Pinsent Masons.

Genuine innovation

Kodak's decision to start legal proceedings against Shutterfly will have put scores of web-based photo companies, such as Flickr and Google, on high alert, she told BBC News.

"The patents Kodak holds are incredibly broad, effectively covering images that are stored centrally and can be ordered online," she said.

That's likely to mean Kodak will go after other online image sites it believes also infringe its patents, she added.

Kodak said it has over 400 similar patents.

"We are committed to protecting these assets from unauthorised use," it said in a statement.

Given the expense of patent cases, many smaller firms may choose to license Kodak's technology rather than fight claims, said Theo Savvides, head of intellectual property at Osborne Clarke.

But firms such as Google and Yahoo "have deep pockets" that would allow them to challenge Kodak's claims, he added.

Such challenges would likely focus on the validity of Kodak's patents, said Ms Bould.

The case may hinge on Kodak's ability to show that when it filed the patents they covered technology that was genuinely innovative, she added.

Kodak has been hit hard by the shift towards digital photography, but has recently shown a greater willingness to assert its rights over technology it believes infringes its patents.

Earlier this year Kodak said it would sue Apple and BlackBerry maker Research in Motion over technology used in their handsets.


