As of 2017, electricity generation in the United Kingdom is around 50% fossil fuel, roughly 20% nuclear, and 30% renewable. Renewable generation is growing strongly, while the use of fossil fuel generators in general, and coal in particular, is shrinking. Coal generators are now mainly run in winter because of pollution and costs. In 2008, nuclear electricity production was 860 kilowatt-hours per person.

In 2014, 28.1 terawatt-hours of electricity were generated by wind power, meeting 9.3 percent of the UK's electricity requirement. In 2015, wind power generated 40.4 terawatt-hours, and the quarterly generation record was set in the three months from October to December 2015, when wind met 13% of the nation's electricity demand. Wind power contributed 15 percent of UK electricity generation in 2017, and 18.5 percent in the final quarter of that year.

The United Kingdom completed a voluntary phase-out of incandescent lightbulbs in 2011. Between 2007 and 2012, the UK's peak electrical demand fell from 61.5 gigawatts to 57.5 gigawatts, and electricity use in 2009 was 11% lower than in 2004.

The history of the National Grid in the United Kingdom dates back to the early 1900s. The first to use Nikola Tesla's three-phase high-voltage electric power distribution in the UK was Charles Merz, of the Merz & McLellan consulting partnership, at his Neptune Bank Power Station near Newcastle upon Tyne. The power station opened in 1901, and by 1912 his system had developed into the largest integrated power system in Europe. In 1925, the British government sought the help of Lord Weir, a Glaswegian industrialist, to solve the problem of Britain's inefficient and fragmented electricity supply industry. Weir consulted Merz, and the result was the Electricity (Supply) Act 1926, which recommended the creation of a national gridiron...