Friday, 22 December 2017
Labour and the Conservatives do not agree on much, but both go along with regressive changes to benefits policy, and the very poorest people in our communities are going to see big cuts in their already meagre living standards. The changes to benefits policy are massively bad news for the 50%+ of families in our country who receive income from at least one state benefit. The Institute for Fiscal Studies (IFS) says that the Conservatives’ plans will reduce the net incomes of households in the bottom income decile by a tenth. The incomes of working-age families with children in the bottom decile could end up a full 15% lower - that's a big cut when you've next to nothing. Labour folk tell me they are offering a radical alternative, but let’s look at the facts behind the hyperbole. The distributional impact of Labour’s tax-and-benefits policy is almost the same as the Conservatives’. Labour has committed only £4.5bn, saying that cancelling the benefits freeze entirely is unaffordable, yet Labour has pledged about £10bn to remove university tuition fees, a policy that heavily benefits the better-off. Now it’s not a fashionable or vote-winning thing to put families on benefits before students – it is however the right thing to do. http://www.centreforwelfarereform.org/news/the-uk-gover-more-unfair/00351.html
Posted by KRA at 13:23
Wednesday, 13 December 2017
An unwelcome surprise for the new year is a planning application for 255 houses west of Higher Exeter Road (Planning Application 17/024880/MAJ). I strongly opposed the decision of Teignbridge District Council’s planning committee on 29th July 2014 to grant outline planning permission (14/00447/MAJ). Along with residents, I was very concerned about the impact on the landscape, ecology and particularly traffic issues. I was particularly concerned that the developers had included land in that application which is outside the area designated TE3 in Plan Teignbridge - the Local Plan. TE3 has been earmarked for residential development including “on-site provision of formal and informal recreation areas”. However, in application 14/00447/MAJ two recreational areas were proposed outside the north-west and south-west boundaries of the TE3 site. My reading of Plan Teignbridge is that the TE3 policy requires all the residential AND recreational facilities to be contained within the TE3 boundary (page 125, https://www.teignbridge.gov.uk/media/1669/local-plan-2013-33.pdf). Whilst it has to be accepted that Higher Exeter Road is identified for housing development in TE3 (on page 125 of Plan Teignbridge), that doesn’t mean 255 houses on this site is a foregone conclusion, although there will have to be some development. The planners should give Policy EN2 Undeveloped Coast (page 61 of Plan Teignbridge) greater weight in deciding this application. EN2 says: “The protection, maintenance and enhancement of the distinctive landscape and seascape character and ecological qualities of the undeveloped coast, will be a priority alongside the ecological and biodiversity considerations. 
Development which would have a detrimental effect on the character of the undeveloped coast and estuaries will not be permitted.” Planning Reasons for Objection: Design: the building of 255 houses on this elevated site would be overbearing and dominant in the landscape and have a major detrimental effect on it (EN2 Undeveloped Coast). Traffic and Access: the concerns about access for traffic on/off the B3192 (Higher Exeter Road) have not been adequately addressed. The future residents of this site are likely to have between 500 and 600 cars, a recipe for gridlock on the Exeter Road as bad as Bitton Park Road. Drainage and Flooding: the valley has many springs and streams; this used to be Teignmouth’s water supply. The valley also absorbs a lot of rain and surface water, and the run-off from this development could create a high flood risk for homes downhill.
Posted by KRA at 08:14
Friday, 8 December 2017
The Bullet Cluster consists of two colliding clusters of galaxies (well, strictly speaking, the name Bullet Cluster refers to the smaller sub-cluster, moving away from the larger one). Gravitational lensing* studies of the Bullet Cluster are claimed to provide the best evidence for the existence of particle dark matter. But the Bullet Cluster isn’t the incontrovertible evidence for particle dark matter that we have been told it is; it’s possible to explain the Bullet Cluster with models of modified gravity. Modified gravity works by introducing additional fields that are coupled to gravity. There’s no reason that, in a dynamical system, these fields have to be focused at the same place as the normal matter. Modified gravity should have a path dependence that leads to the kind of delocalisation observed in this, and other, cluster collisions. https://arxiv.org/abs/1003.0939 *A gravitational lens is a distribution of matter (such as a cluster of galaxies) between a distant light source and an observer that is capable of bending the light from the source as the light travels towards the observer. This effect is known as gravitational lensing, and the amount of bending is one of the predictions of Albert Einstein's general theory of relativity.
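To put a number on the footnote’s “amount of bending”: for a point-mass lens, general relativity gives a deflection angle of 4GM/(c²b), where b is the impact parameter. This little sketch (standard textbook values, not anything from the papers linked above) reproduces the classic result that starlight grazing the Sun is bent by about 1.75 arcseconds:

```python
# Point-mass gravitational lens: deflection angle alpha = 4*G*M / (c^2 * b).
# Classic sanity check: light grazing the solar limb bends ~1.75 arcsec.

G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
C = 2.998e8          # speed of light, m/s
M_SUN = 1.989e30     # solar mass, kg
R_SUN = 6.957e8      # solar radius, m (impact parameter at the limb)

alpha_rad = 4 * G * M_SUN / (C**2 * R_SUN)
alpha_arcsec = alpha_rad * 206265    # radians -> arcseconds
print(f"deflection at the solar limb: {alpha_arcsec:.2f} arcsec")
```

For a galaxy cluster the mass distribution is extended rather than a point, so lensing maps are built by integrating deflections over the whole mass field - which is exactly why lensing can show where the gravitating “stuff” is, whether that is dark matter or an extra gravitational field.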
Posted by KRA at 18:23
The Beach clean team are celebrating the '
Posted by KRA at 17:40
Monday, 4 December 2017
The gravitational wave detection reported in August, GW170817, was accompanied by electromagnetic radiation. Both signals arrived on Earth almost simultaneously, within a time-window of a few seconds. The gravitational-wave event GW170817 was observed by the Advanced LIGO and Virgo detectors, and the gamma-ray burst GRB 170817A was observed independently by the Fermi Gamma-ray Burst Monitor and by the Anticoincidence Shield of the Spectrometer for the International Gamma-Ray Astrophysics Laboratory (INTEGRAL). https://arxiv.org/ftp/arxiv/papers/1710/1710.05834.pdf
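That few-second window is remarkably tight given how far the signals travelled, and it is what lets the paper constrain the speed of gravity. A back-of-envelope sketch, using the round numbers reported for the event (a ~1.7 s delay and a ~40 Mpc distance) and attributing the whole delay to propagation:

```python
# Rough speed-of-gravity bound from GW170817 / GRB 170817A.
# Assumes the ~1.7 s gamma-ray delay and ~40 Mpc source distance
# reported for the event; treats the entire delay as a difference
# in propagation speed, which gives an order-of-magnitude bound.

MPC_IN_M = 3.086e22    # metres per megaparsec
C = 2.998e8            # speed of light, m/s

distance_m = 40 * MPC_IN_M
travel_time_s = distance_m / C     # light travel time, ~4e15 s
delay_s = 1.7                      # observed GW-to-gamma-ray delay

fractional_bound = delay_s / travel_time_s
print(f"|v_gw - c| / c  <~  {fractional_bound:.1e}")
```

A couple of seconds out of roughly a hundred million years of travel time pins the speed of gravitational waves to that of light to better than one part in 10^15 - which is why this single event ruled out whole families of modified-gravity theories.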
Posted by KRA at 13:59
Sunday, 3 December 2017
Gian Francesco Giudice is a theoretical physicist at CERN (where my brother Andrew did research); his research mainly deals with the formulation of new theories that extend our present knowledge of the particle world toward even smaller distances. He is also studying how these theories can be applied to cosmology in order to describe the early stages of the history of our universe. His most notable results are in the areas of supersymmetry, extra dimensions, electroweak physics, collider physics, dark matter, and leptogenesis. In “The Dawn of the Post-Naturalness Era” (an imaginary conversation with Guido Altarelli), Giudice is clear that he still holds that “naturalness is a well-defined concept.” Dr Sabine Hossenfelder believes this is wrong - or rather, that if you make naturalness well-defined, it becomes meaningless. Her argument goes as follows. Naturalness in quantum field theories – ie, theories of the type of the standard model of particle physics – means that a theory at low energies does not sensitively depend on the choice of parameters at high energies. People say this means that “the high-energy physics decouples.” However, changing the parameters of a theory is not a physical process - the parameters are whatever they are. The processes that are physically possible at high energies decouple whenever effective field theories work, pretty much by definition of what it means to have an effective theory. But this is not the decoupling that naturalness relies on. To quantify naturalness you move around between theories in an abstract theory space. This is very similar to moving around in the landscape of the multiverse. Indeed, it is probably not a coincidence that both ideas became popular around the same time, in the mid-1990s. So if you want to quantify how sensitively a theory at low energy depends on the choice of parameters at high energies, you first have to define the probability for making such choices. 
This means you need a probability distribution on theory space. It’s the exact same problem you have for inflation and in the multiverse. Most papers on naturalness leave the probability distribution unspecified, which implicitly means choosing a uniform distribution over an interval of about length 1. The typical justification for this is that once you factor out all dimensionful parameters, you should only have numbers of order 1 left. It is with this assumption that naturalness becomes meaningless, because you have now simply postulated that numbers of order 1 are better than other numbers. You can read Gian's paper here: https://arxiv.org/pdf/1710.07663.pdf
Posted by KRA at 10:30