Tuesday, 20 May 2025


GLOBAL WARMING: IS IT REALLY HAPPENING?



Dr. Bhuban Gogoi
INTRODUCTION:   
Global warming is thought to be the most significant event to have occurred during this recent age of highest scientific development. It is also known as the Anthropogenic Global Warming (AGW) event. The IPCC claims that the event is recognised on a strong consensus of the world scientific community. It is strongly supported by the UN and its counterpart the UNFCCC, of which most countries of the world are members. The developed countries, along with the third world, support the claim and are taking steps through the Kyoto Protocol to control emissions of CO2, which is thought to be responsible for the problem. Against this, a large number of scientists contest this so-called consensus through the Oregon Petition, the Leipzig Declaration and other media (whose opinions have been largely suppressed and mostly sealed); they consider it the greatest hoax and fraud of the age. They call it the great 'Climategate', a conspiracy against the people of the world and a shining example of the corruption of science. This paper seeks to review the basic foundation of the idea that the IPCC claims to be unequivocally accepted, and to find its real picture in the light of the views of the scientists opposed to it.
Global warming is the rise in temperature of the earth's atmosphere and oceans since the late 19th century, and its projected continuation. The earth's mean surface temperature has increased by about 0.8 °C (1.4 °F) since the early 20th century, with about two thirds of the increase occurring since 1980. The rise is attributed mainly to increasing concentrations of greenhouse gases (GHG), chiefly CO2 produced by growing human activity in burning fossil fuels and deforestation. This finding is supported and recognised by the academies of science of most of the industrialised countries of the world. The warming appears in the IPCC's Fourth Assessment Report (AR4) on climate change (fig. 1) and its Summary for Policy Makers (SPM), which indicate that the 21st century will witness a rise in the earth's mean temperature of about 1.1 to 2.9 °C (2.0 to 5.2 °F) at the lowest CO2 emission level and about 2.4 to 6.4 °C (4.3 to 11.5 °F) at the maximum level. Future warming and related changes will vary from region to region around the globe. The effects of an increase in global temperature include a rise in sea level and a change in the amount and pattern of precipitation, as well as a probable expansion of subtropical deserts. Warming is expected to be strongest in the Arctic and would be associated with the continuing retreat of glaciers, permafrost and sea ice. Other likely effects include more frequent extreme-weather events such as heat waves, droughts and heavy rainfall, ocean acidification, and species extinctions due to shifting temperature regimes. Effects significant to humans include the threat to food security from decreasing crop yields and the loss of habitat from inundation and desertification.
Proposed policy responses to global warming include mitigation by emissions reduction, adaptation to its effects, and possible future geo-engineering. Most countries are parties to the United Nations Framework Convention on Climate Change (UNFCCC), whose ultimate objective is to prevent dangerous anthropogenic (i.e., human-induced) climate change. Parties to the UNFCCC have adopted a range of policies designed to reduce greenhouse gas emissions and to assist in adaptation to global warming, and have agreed that deep cuts in emissions are required and that future global warming should be limited to below 2.0 °C (3.6 °F) relative to the pre-industrial level. Reports published in 2011 by the United Nations Environment Programme and the International Energy Agency suggest that efforts of the early 21st century to reduce emission levels may be inadequate to meet the UNFCCC's 2 °C target.
GLOBAL TEMPERATURE DATA – SOURCE AND METHODS:
The main global surface temperature data set is managed by the US National Oceanic and Atmospheric Administration (NOAA) at the National Climatic Data Center (NCDC). This is the Global Historical Climate Network (GHCN). Then there are the surface temperature data of NASA-GISS and the Hadley HadCRUT, all using the same raw data set but with different adjustment methods. Satellite data sets from 1979 onwards from RSS and UAH are available, using a common satellite data set but with different processing techniques. The period of record of the earth's surface temperature varies from station to station, with several thousand records extending back to 1950 and several hundred being updated monthly. This is the main source of data for global studies.

Fig-1- Increase of global temperature and sea level and decrease of Northern Hemisphere snow cover (Source: SPM of IPCC).

However, the measurement of a "global" temperature is not as simple as it may seem. Historical instrumentally recorded temperatures exist only for 100 to 150 years, in small areas of the world. During the 1950s to 1980s temperatures were measured in many more locations, but many stations are no longer active in the database. Satellite measurements of atmospheric temperature began in 1979.
Now let us see the methods of finding the earth's temperature data to get the trend of rising or falling. As per the IPCC, average surface air temperatures are calculated at a given station by the following procedure: record the minimum and maximum temperature for each day; calculate the average of the minimum and maximum; calculate the monthly averages from the daily data; calculate the annual averages by averaging the monthly data. Various adjustments are also made, so it is not actually simple. The IPCC uses data processed and adjusted by the UK-based Hadley Climatic Research Unit of the University of East Anglia (HadCRU), although much of the HadCRU data comes from the GHCN (Global Historical Climate Network) of the US National Oceanic and Atmospheric Administration (NOAA) at the National Climatic Data Centre (NCDC) and from NASA-GISS (Goddard Institute of Space Studies). The UK-based HadCRU provides the following description:
"Over land regions of the world, over 3000 monthly station temperature time series are used. Coverage is denser over the more populated parts of the world, particularly the United States, southern Canada, Europe and Japan. Coverage is sparsest over the interior of the South American and African continents and over the Antarctic. The number of available stations was small during the 1850s, but increased to over 3000 stations during the 1951-90 period. For marine regions, sea surface temperature (SST) measurements taken on board merchant and some naval vessels are used. As the majority come from the voluntary observing fleet, coverage is reduced away from the main shipping lanes and is minimal over the Southern Oceans."
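The station-averaging steps described above (daily min/max to daily mean, then monthly and annual means) can be sketched in a few lines. This is a toy illustration with invented readings, not actual GHCN processing:

```python
# Hypothetical sketch of the per-station averaging procedure
# (all sample numbers are invented, not real GHCN data).

def daily_mean(t_min, t_max):
    """Daily average as the midpoint of the recorded min and max."""
    return (t_min + t_max) / 2.0

def monthly_mean(daily_means):
    """Monthly average of the daily means."""
    return sum(daily_means) / len(daily_means)

def annual_mean(monthly_means):
    """Annual average of the monthly means."""
    return sum(monthly_means) / len(monthly_means)

# Example: a station reporting (min, max) pairs for a toy three-day month
days = [(10.0, 20.0), (11.0, 21.0), (9.0, 19.0)]
dailies = [daily_mean(lo, hi) for lo, hi in days]   # [15.0, 16.0, 14.0]
month = monthly_mean(dailies)                       # 15.0
year = annual_mean([month] * 12)                    # 15.0 in this trivial case
```

Even this simple chain shows why the result is a constructed statistic rather than a measured quantity: every stage discards within-day and within-month variability.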
Stations on land are at different elevations, and different countries estimate average monthly temperatures using different methods and formulae. To avoid biases that could result from these problems, monthly average temperatures are reduced to anomalies from the period with best coverage (1961-90). For a station to be used, an estimate of its base-period average must be calculated. Because many stations do not have complete records for the 1961-90 period, several methods have been developed to estimate 1961-90 averages from neighbouring records or from other sources of data. Over the oceans, where observations are generally made from mobile platforms, it is impossible to assemble long series of actual temperatures for fixed points. However, it is possible to interpolate historical data to create spatially complete reference climatologies (averages for 1961-90) so that individual observations can be compared with a local normal for the given day of the year.
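The reduction to anomalies against the 1961-90 base period can be sketched similarly. The station series below is invented; as the text notes, a station with no base-period coverage cannot be used directly:

```python
# Toy illustration of reducing a station series to anomalies against
# the 1961-90 base period (station values are invented).

base_start, base_end = 1961, 1990

def base_average(series):
    """Mean over the 1961-90 base period; None if the station has no coverage."""
    vals = [t for (yr, t) in series if base_start <= yr <= base_end]
    return sum(vals) / len(vals) if vals else None

def anomalies(series):
    base = base_average(series)
    if base is None:
        return None   # station cannot be used without a base-period estimate
    return [(yr, t - base) for (yr, t) in series]

station = [(1961, 14.0), (1990, 15.0), (2000, 15.5)]
# base = 14.5, so the anomalies are -0.5, 0.5 and 1.0
print(anomalies(station))
```

In practice the agencies estimate missing base-period averages from neighbouring stations, which is one of the adjustment steps this paper questions.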
It is important to note that the HadCRU station data used by the IPCC are not publicly available – neither the raw data nor the adjusted data; only the adjusted gridded data (i.e. after adjustments are made and station anomalies are averaged for each 5x5 degree grid) are available. The anomalies of the stations in each 5x5 degree grid are averaged, after some adjustments and weighting, to obtain the grid's value for each year. All the grid anomalies of a hemisphere for each year are then averaged to get a hemispheric yearly anomaly, and the two hemispheric values are averaged to obtain a single global average annual temperature anomaly. Plotting these global annual temperature anomalies on a graph reveals the trend of temperature.
Different agencies use different methods for calculating a global average after the adjustments are made to the temperatures. As already mentioned, in the HadCRU method (used by the IPCC), anomalies are calculated based on the average observed in the 1961-1990 period (thus stations without data for that period cannot be included). For the calculation of global averages, the HadCRU method divides the world into a series of 5 x 5 degree grids, and the temperature for each grid cell is calculated by averaging the stations in it. The number of stations varies greatly over the world, and many grid cells contain no stations at all. The two component parts (land and marine) are separately interpolated to the same 5° x 5° latitude/longitude grid boxes. Land temperature anomalies are in-filled where more than four of the surrounding eight 5° x 5° grid boxes are present. Weighting methods can vary, but a common one is to average the grid-box temperature anomalies, weighted according to the area of each 5° x 5° grid cell, into hemispheric values; the hemispheric averages are then averaged to create the global-average temperature anomaly. The IPCC deviates from the HadCRU method at this point – instead the IPCC uses "optimal averaging. This technique uses information on how temperatures at each location co-vary, to a given time." Thus empty grid cells are interpolated from surrounding cells. Another method is to average the cells within latitude bands and then average all the latitude bands.
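The area-weighted hemispheric and global averaging just described can be sketched as follows. This is a toy illustration, not HadCRU's actual code: the grid anomalies are invented, and the area weight of a 5x5 box is approximated by the cosine of its centre latitude:

```python
import math

# Sketch of HadCRU-style averaging: grid-box anomalies are weighted by
# cell area (proportional to cos of centre latitude), averaged per
# hemisphere, and the two hemispheric means are averaged. Values invented.

def hemisphere_mean(cells):
    """cells: list of (centre_latitude_deg, anomaly); None = no data."""
    w_sum = a_sum = 0.0
    for lat, anom in cells:
        if anom is None:          # empty grid boxes are simply skipped
            continue
        w = math.cos(math.radians(lat))   # area weight of a 5x5 box
        w_sum += w
        a_sum += w * anom
    return a_sum / w_sum

north = [(2.5, 0.4), (47.5, 0.6), (77.5, None)]   # an empty Arctic box
south = [(-2.5, 0.1), (-47.5, -0.1)]
global_anom = (hemisphere_mean(north) + hemisphere_mean(south)) / 2.0
```

Note how the empty Arctic box simply drops out of the weighting; whether such boxes are skipped or in-filled by proxy is exactly the methodological choice criticised later in this paper.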
In GISS, "A grid of 8000 grid boxes of equal area is used. Time series are changed to series of anomalies. For each grid box, the stations within that grid box and also any station within 1200 km of the centre of that box are combined using the 'reference station method'. A similar method is also used to find a series of anomalies for 80 regions consisting of 100 boxes from the series for those boxes, and again to find the series for 6 latitudinal zones from those regional series, and finally to find the hemispheric and global series from the zonal series." These are the methods of producing the global temperature anomaly series whose plot on a graph shows the rising or falling trend.
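A rough sketch of this kind of distance-based station combination follows, assuming (per Hansen and Lebedeff's published description of the reference station method) weights that fall linearly from 1 at the box centre to 0 at 1200 km. The station distances and anomalies are invented:

```python
# Hedged sketch of GISS-style distance weighting within a grid box.
# Assumption: weight decreases linearly from 1 at the centre to 0 at
# 1200 km, as described for the reference station method. Data invented.

LIMIT_KM = 1200.0

def weight(distance_km):
    return max(0.0, 1.0 - distance_km / LIMIT_KM)

def box_anomaly(stations):
    """stations: list of (distance_from_centre_km, anomaly)."""
    pairs = [(weight(d), a) for d, a in stations]
    w_sum = sum(w for w, _ in pairs)
    return sum(w * a for w, a in pairs) / w_sum

# Two nearby stations and one 900 km away still produce a box value,
# even though none of them may lie inside the box itself.
print(box_anomaly([(100, 0.3), (200, 0.5), (900, 1.1)]))
```

The design point the paper disputes is visible here: a box can acquire a "temperature" from stations far outside it, which spreads any station's trend over a very large area.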
DEFECTS IN METHODS OF DETERMINING GLOBAL WARMING:
There are some genuinely untenable theoretical and technical grounds behind the temperature-determining methods applied by the IPCC, HadCRU and GISS. The earth's average surface temperature is not simply a mathematical calculation; it is not a single temperature of the earth as a whole recorded from the sky, from space or from the high atmosphere. Satellite recording of the earth's temperature started in 1979, and since that time the satellite data show no sign of rising temperature or global warming. So, depending only on an average surface temperature determined purely by mathematical calculation, with a wide range of theoretical and technical limitations, how can we unequivocally accept the IPCC's conclusion on global warming? A good number of genuine limitations raise the question of the tenability and unequivocal acceptability of the global warming theory: limitations in data availability and in the required quantum of data; inadequate consideration of the wide range of geographical, climatic and seasonal variability; limitations in the methods of averaging and weighting; limitations in grid-making and area concepts; locational factors of data acquisition; use of proxies in the absence of data; adjustment of data on an unknown basis; misrepresentation and misinterpretation of historical data; cartographic misrepresentation and the question of scale; data manipulation, mismanagement and suppression; limitations in theoretical and computer models of forecasting or prediction; limitations in interpretation and explanation; instrumental limitations; and so on. There is no counting of heads or voting politics for acceptability in science. Let us discuss these categorically in the following points.
1. Data availability is a major defect of the theory. The temperature recording stations of GISS are not evenly distributed over the world. The US has the largest and densest network of stations, and the zone between 30°N and 60°N has the highest coverage (more than 69% of stations, of which about half are in the US). Other areas have only a sparse number of stations except for some patches, and the vast oceans are mostly devoid of recording stations (fig. 2). So how can the temperature measurement be global? A warming trend computed for the US will be more reliable than one for other parts of the globe, or for the globe as a whole. The following table-1 shows the distribution of stations by latitude band.
                                                            Table-1
In addition to the sparseness and uneven distribution of stations (fig. 2), the number of stations also changes from time to time. Land coverage slowly increased from 10% in the 1880s to 40% in the 1960s and has continuously decreased in recent years (fig. 3).

Fig-2- World distribution of temperature recording stations of GISS (NASA).
Fig-3- Yearly number of stations (panel c shows the % of hemisphere area within 1200 km of a station, equal to almost twice the area of a 5x5 degree grid box of HadCRU). Source: NASA-GISS [http://data.giss.nasa.gov/gistemp/station_data/].
The following figure shows the variation in the number of stations (a) in the GHCN from 1850 to 1997 and the variation in the global coverage of the stations as defined by 5° x 5° grid boxes (b). There was a major disappearance of recording stations in the late 1980s and early 1990s.

Fig.4- Existence of recording stations over time.
The following fig. 5 compares the number of global stations in 1900, the 1970s and 1997, showing the increase and then the decrease. The 1997 panel shows the coverage of stations that can still be used as of 1997 – i.e. the blank areas have no coverage in recent times.

Fig.5- Comparison of Available GHCN Temperature Stations over Time.
The University of Delaware has an animated movie of station locations over time. In addition, many stations have moved location, and some stopped collecting data during periods of war (or, for example, during the Cultural Revolution in China) – leading to a major problem of discontinuities.
The following figure shows the number of stations in the GHCN database with data for selected years, showing the number of stations in the United States (light) and in the rest of the world (ROW – dark). The percentages indicate the share of the total number of stations that are in the U.S.

Fig.6- Comparison of Number of GHCN Temperature Stations in the US versus Rest of the World.
The following fig. 7 shows a calculation of straight temperature averages for all reporting stations from 1950 to 2000. While a straight average is not meaningful for a global temperature calculation (since areas with more stations would receive higher weighting), it illustrates that the disappearance of so many stations may have introduced an upward temperature bias. As can be seen in the figure, the straight average of all global stations does not fluctuate much until 1990, at which point the average temperature jumps up. This observational bias can influence the calculation of area-weighted averages to some extent. A study by Willmott, Robeson and Feddema ("Influence of Spatially Variable Instrument Networks on Climatic Averages," Geophysical Research Letters, vol. 18, no. 12, pp. 2249-2251, Dec 1991) calculated a +0.2 °C bias in the global average due to pre-1990 station closures.
                       
Fig.7-Calculation of Average Temperatures from Reporting Stations for 1950 to 2000.
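The station-dropout effect discussed above can be illustrated with a toy calculation (all numbers invented): when cold-region stations stop reporting, a straight average of the remaining stations jumps upward even though no individual station recorded any warming:

```python
# Toy simulation of station dropout bias in a straight (unweighted)
# average. All station values are invented and held constant, so any
# change in the average comes purely from which stations still report.

cold_stations = [-10.0] * 50   # e.g. high-latitude stations
warm_stations = [20.0] * 50

def straight_average(stations):
    return sum(stations) / len(stations)

before = straight_average(cold_stations + warm_stations)
# Most cold-region stations close (as happened around 1990):
after = straight_average(cold_stations[:10] + warm_stations)
print(before, after)   # the "global" average rises with no climate change
```

Area weighting damps but does not eliminate this effect when whole cold grid boxes lose their only stations.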
The cartographic scales used also give wrong impressions, and some misrepresentation occurs there. Vertical scales are often much greater than horizontal scales, which exaggerates the data in the vertical, showing a much larger image of the slightest variation or change. If the horizontal and vertical scales are the same or similar, the impression is almost correct. Again, dissimilar vertical scales are sometimes used for comparative purposes, in which case the comparison cannot serve its purpose. Everyone involved knows this, so it appears to be intentional.
Temperature measurement stations must be continually re-evaluated for suitability for inclusion due to changes in the local environment, such as increased urbanization, which causes local increases of temperature regardless of external environmental influences. Thus only rural stations can validly be used in calculating temperature trends. But our stations are mostly urban. As a result, adjustments are made to the temperature data at urban stations, as discussed in later paragraphs.
There is substantial debate in the scientific community regarding the use of various specific stations, as well as regarding the factors that affect the uncertainty involved in the measurements. For example, the Pielke et al. paper is a recent (Feb 2007) publication by 12 authors describing the temperature measurement uncertainties that have not been taken into sufficient consideration.
 The Surface Stations web site is accumulating physical site data for the temperature measurement stations (including photographs) and identifying problem stations -- there are a significant number of stations with improper site characteristics, especially in urban areas.
2. Geographical, altitudinal, latitudinal, climatic and seasonal variations become meaningless, and in some cases immaterial, when only one yearly generalised average surface temperature or temperature anomaly for the whole earth is considered. Is this justifiable? The earth is a sphere warmed by sunlight that falls most directly over the low latitudes, so strict latitudinal variations occur. In this case a latitudinal average is meaningful and may be allowed, but averaging across latitudes with a wide range of temperature variability is meaningless and can never be allowed. Again, there are altitudinal variations – low temperatures at high altitudes, just as at high latitudes. These cannot be averaged or generalised except in very limited local cases. Likewise, climatic variability and contrasts – dry versus humid, desert versus polar/alpine, continental versus coastal/oceanic, etc. – cannot be bound together into one generalised average temperature. Nor can the seasonal variability of a station's temperature be reduced to a single generalised yearly average, which has no meaning at all. Surface and time (the yearly seasonal cycle) factors produce a wide range of variation over the earth, and reducing them to a single generalised value is meaningless and unscientific.
3. Limitations in the applied temperature-determining scientific methods are another drawback of the theory. NASA-GISS processes the GHCN data through a series of data adjustments to calculate global temperatures, using a different method than HadCRU. The following figures compare the GISS 2005 temperature anomalies with the HadCRU HadCRUT data. GISS uses a 1200-km smoothing which creates an artificial expansion of data into areas without data. Thus the Arctic shows more (artificial) warming in the GISS data than in the HadCRUT. Grey areas are areas without data.
Fig.8-  Artificial warming of Arctic area due to adjustment by proxy by GISS (smoothing) and HadCRU (?).
4. Another main drawback is the complete absence of recording stations and data in a large number of the land grids of HadCRU and GISS, the lack of the required length of time series (including the base period 1961-1990) at the stations of most grids, and the near-total absence of data for the oceanic regions apart from some unreliable and irregular data taken from shuttling ships. Interestingly, these blank areas are filled with proxy data using different methods by different agencies, as already discussed under 'Global temperature data – source and methods'. HadCRU, GISS and the IPCC each use a different technique to fill blank temperature anomaly grids by proxy. HadCRU uses an 'interpolation method', filling a grid from the values of at least four of the eight surrounding grids, where available, for the land and ocean component parts separately. The IPCC uses an 'optimal averaging method', which uses information on how the temperature at each location co-varies to interpolate the data. GISS uses the 'reference station method', which combines the stations within the grid and stations up to 1200 km from the centre of that grid for interpolation. Thus warming is made unequivocally(?) acceptable to all, ironically, by filling in the blank grid boxes on a large scale with theoretical proxies.
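The HadCRU-style in-filling rule described above (fill an empty land box only when more than four of its eight neighbours have data) can be sketched as follows; the grid values are invented:

```python
# Sketch of neighbour-based in-filling of an empty grid box.
# Rule assumed from the text: fill with the mean of the eight
# surrounding boxes only if more than four of them have data.

def infill(grid, row, col):
    """grid: 2-D list of anomalies (None = no data)."""
    if grid[row][col] is not None:
        return grid[row][col]
    neighbours = [grid[r][c]
                  for r in (row - 1, row, row + 1)
                  for c in (col - 1, col, col + 1)
                  if (r, c) != (row, col)
                  and 0 <= r < len(grid) and 0 <= c < len(grid[0])]
    present = [v for v in neighbours if v is not None]
    if len(present) > 4:
        return sum(present) / len(present)
    return None   # box stays empty

grid = [[0.2, 0.4, 0.3],
        [0.5, None, 0.1],
        [None, 0.2, 0.3]]
print(infill(grid, 1, 1))   # seven neighbours present -> their mean
```

The filled value then enters the hemispheric average exactly as if it had been measured, which is the crux of the objection raised in this point.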
5. Adjustment of data is another drawback of global warming, the technique of which has not been disclosed by the IPCC to date. Adjustments are made in many fields. GISS's 1200-km smoothing is an adjustment to cover areas where there is no data, as mentioned in point four above, which creates artificial warming in areas like the Arctic region. Adjustments are also made to the historical temperature data set on an unknown basis. Area edit (then called Raw), time of observation (TOBS), equipment change (MMTS), station history adjustment (SHAP), filling of missing data (FILNET), urban warming adjustment (urban), etc. are the fields in which adjustments are made to the GHCN data. But the adjustments have no solid logical ground. The following examples, published at www.apinsys.com/GlobalWarming, give an idea of the secrecy of the adjustments.
Fig.9- The Northern Hemisphere average temperature from National Geographic, November 1976.
Fig.10- Northern Hemisphere average temperature from the Met Office Hadley Centre.
   
Fig.11- Combination (superimposition) of the above two figures. A warming revision occurred in 1958 in the HadCRU study.
The temperature data recorded at the stations is not simply used in the averaging calculations: it is first adjusted. Different agencies use different adjustment methods. The station data is adjusted for e.g. homogeneity (i.e. nearby stations are compared and adjusted if trends are different). As an illustration of the sometimes questionable effects of temperature adjustments, consider the United States data (almost 30 percent of the world's total historical climate stations are in the US, rising to 50% of the world's stations for the post-1990 period). The following graphs show the historical US data from the GISS database as published in 1999 and 2001. The graph on the left was produced in 1999 (Hansen et al. 1999) and the graph on the right was produced in 2000 (Hansen et al. 2001) [http://pubs.giss.nasa.gov/docs/2001/2001_Hansen_etal.pdf]. They are from the same raw data – the only difference is that the adjustment method was changed by NASA in 2000.
Fig.12-U.S. Temperature Changes Due to Change in Adjustment Methods (Left: 1999, Right 2001)
The following figure compares the above two graphs, showing how an increase in the temperature trend was achieved simply by changing the method of adjusting the data. Some of the major changes are highlighted in this figure – the decreases in the 1930s and the increases in the 1980s and 1990s.

Fig.13- Comparison of U.S. Temperature Changes Due to Change in Adjustment Methods. See http://www.appinsys.com/globalwarming/Hansen_GlobalTemp.htm for more information on Hansen's data manipulations.
There are many examples to prove that data adjustment is made only to set a rising temperature trend for a particular grid in place of the real flat or declining trend. The apparent a priori assumption is that a flat or declining trend is wrong and must be adjusted to obtain the 'correct' rising trend. It is also observed that most of the recording stations are located in or near urban centres, which always show a rising trend; these are adjusted to resemble rural stations, especially in the Northern Hemisphere, yet even here the rising trend is kept. Examples may be cited as follows.
Temperature station adjustments are theoretically supposed to make the data more realistic for identifying temperature trends. In some cases the adjustments make sense; in other cases, not. Adjustments are often made to U.S. stations that do not make sense but invariably increase the apparent warming. The following figure shows the closest rural station to San Francisco (Davis, left) and the closest rural station to Seattle (Snoqualmie, right). In both cases warming is artificially introduced into the rural stations. (See http://www.appinsys.com/GlobalWarming/GW_Part3_UrbanHeat.htm for details on the Urban Heat Island effects evident in the surface data.)
Fig. 14- Artificial Warming Trends in Adjustments to U.S. Rural Stations (upper real and lower graphs adjustment).
Here is an example where the adjustment makes sense. In Australia the raw data for Melbourne show warming, while the nearest rural station does not. The following figure compares the raw data (lower graph) and adjusted data (upper graph) for Melbourne. However, this seems to be a rare instance.

                   
       Fig. 15- Comparison of Adjusted and Unadjusted Temperature Data for    Melbourne, Australia.
 The following graph is more typical of the standard adjustments made to the temperature data – this is for Darwin, Australia (upper – unadjusted, lower – adjusted). Warming is created in the data through the adjustments.
                
Fig. 16- Comparison of Adjusted and Unadjusted Temperature Data for Darwin, Australia

The following figures show a more recent example of the GISS re-adjustment of data (from: Bob Tisdale at http://i44.tinypic.com/29dwsj7.gif). The 2000 and 2009 versions of the GISTEMP data are compared. This shows the additional artificial warming trend created through data adjustment.
Fig.17(a)- The 2000 (right) and 2009 (left) versions of the GISTEMP data.
 
Fig.17(b)- Superimposition of the two GISTEMP versions shows the artificial trend arising from adjustment.
Since 2000, NASA has further "cleaned" the historical record. The following graph shows the further warming adjustments made to the data in 2005. The data can be downloaded at http://data.giss.nasa.gov/gistemp/graphs/US_USHCN.2005vs1999.txt; the graph itself is from http://www.theregister.co.uk/2008/06/05/goddard_nasa_thermometer/print.html.
This figure plots the difference between the 2000 adjusted data and the 2005 adjusted data. Although the 2000 to 2005 adjustment differences are not as large as the 1999 to 2000 adjustment differences shown above, they add additional warming to the trend throughout the historical record.
              
      Fig. 18- differences between adjusted data of temperature of 2000 & 2005.
After adjustments, the urban stations exhibit warming. The following figures compare the adjusted (lower, light graph) with the unadjusted (upper, dark graph) data for Wellington (left) and Christchurch (right). These adjustments introduce a warming trend into urban data that shows no warming in the original.
 Fig.19- Comparison of Adjusted and Unadjusted Temperature Data for Wellington (left) and Christchurch (right).
Even Auckland (below left) and Hokitika (right), both listed as rural, end up with very significant warming trends.
     
Fig.20- Comparison of Adjusted and Unadjusted Temperature Data for Auckland and Hokitika
Adjustments to the data show how all of New Zealand (which exhibits no warming) ends up contributing to "global warming" – the following graphs show unadjusted (left) and adjusted (right) data for Auckland, Wellington, Hokitika and Christchurch.

   
Fig. 21- Comparison of Adjusted and Unadjusted Raw Temperature Data
There are further problems with the IPCC (HadCRU) methods, since HadCRU/IPCC uses an interpolation method for 5x5 degree grids that have no stations. Siberia provides an example of the flaws involved. The following figure shows 5 x 5 degree grids with interpolated data as used by the IPCC, showing the temperature change from 1976 to 1999. Some Siberian 5x5 grids are highlighted in the upper-right rectangular box in the figure; they cover the area of 65°-80° latitude by 100°-135° longitude. This also illustrates the effect of selecting a particular start year, and why the IPCC selected 1976.
                
Fig.22- IPCC Warming from 1976 to 1999 in 5x5 degree grid cells [from Figure 2.9 in the IPCC Third Assessment Report (TAR)].
The NOAA National Climatic Data Center (NCDC) has an animated series showing the temperature anomalies for July of each year from 1880 to 1998 (with no interpolation into empty grids). Source: http://www.ncdc.noaa.gov/img/climate/research/ghcn/movie_meant_latestmonth.gif. The images in the following figure are from that series. They give an indication of the global coverage of the grid temperatures and how the coverage has changed over the years, as well as highlighting two warm and two cool years. The 1930s were a very warm period (compare 1936 in b with 1998 in d below).

                    Fig.23-Temperature Anomalies for 5x5-Degree Grids for Selected Years    (GHCN data)
In the GHCN data shown above, grid boxes with no data are left empty. In the IPCC method, many empty grid boxes are filled in with interpolations, with the net effect of increasing the warming trend. The following figure shows temperature trends for the Siberian area highlighted previously (lat. 65 to 80, long. 100 to 135). Of the eight main temperature dots on the IPCC map, three are interpolated (no data). For the five with data, the number of stations is indicated in the lower left corner of each grid-based temperature graph. The only grid with more than two stations shows no warming over the available data. The average for the entire 15 x 35 degree area is shown in the upper right of the figure. Of the eight individual stations, only two exhibit any warming since the 1930s (the one in long. 130-135 and only one of the two in long. 110-115). An important issue to keep in mind is that in the calculation of global average temperatures, the interpolated grid boxes are averaged in with the ones that actually have data. This example shows how sparse and varying data can contribute to an average that is not necessarily representative.
Fig.24- Temperatures for 5x5 Grids in Lat. 65 to 80 – Long. 100 to 135.
6. Satellite Data - Satellites have more recently been used to remotely sense the temperature of the atmosphere, starting in 1979. The following figure shows the satellite data for Jan 1979 – Jan 2008 (left) and for Jan 2001 – Jan 2008 (right) [http://www.worldclimatereport.com/index.php/2008/02/07/more-satellite-musings/#more-306]. A consistent warming trend is not displayed, as it should be if CO2 were playing its hypothesized role. Satellite data is of somewhat limited use due to its lack of long-term historical coverage – many locations show warm periods in the 1930s-40s, but satellite data starts only in 1979.
                                                    Fig. 25

The following figure shows the global satellite temperature anomalies since satellite data first started to become available in 1979. From 1979 to 1997 there was no warming. Following the major El Nino in 1997-98, there was a residual warming and since then, no warming. All of the warming occurred in a single year.

   
Fig. 26 - Source: http://www.appinsys.com/GlobalWarming/SatelliteTemps.htm

 Global warming is not global. What is the meaning of a global average temperature? Global warming is not uniform over the globe and shows a distinct North/South contrast, with cooling in the Southern Hemisphere. There are also major differences among regions within the hemispheres.
 A recent paper by Syun-Ichi Akasofu at the International Arctic Research Center (University of Alaska Fairbanks) provides an analysis of warming trends in the Arctic [http://www.iarc.uaf.edu/highlights/2007/akasofu_3_07/index.php]: “It is interesting to note from the original paper from Jones (1987, 1994) that the first temperature change from 1910 to 1975 occurred only in the Northern Hemisphere. Further, it occurred in high latitudes above 50° in latitude (Serreze and Francis, 2006). The present rise after 1975 is also confined to the Northern Hemisphere, and is not apparent in the Southern Hemisphere; … the Antarctic shows a cooling trend during 1986-2005 (Hansen, 2006). Thus, it is not accurate to claim that the two changes are a truly global phenomenon.” The following figure shows satellite temperature anomaly data for three world regions (Northern Hemisphere, Tropics and Southern Hemisphere): warming has been occurring only in the Northern Hemisphere.
          Fig. 27 - Hemispheric and tropical temperature trends. Source: http://www.appinsys.com/GlobalWarming/GW_NotGlobal.htm (see for details).
Many such examples may be cited from the scientific works available on the internet and in the research literature. Against this background, the global warming of the IPCC cannot be valid: proxies, manipulation, adjustment and the like render the data unreal, and work based on unreal data will itself be unreal.
Manipulation has also been made in the historical temperature data cited from the Vostok ice sheet and from tree-ring proxies. Again, there was strong secret suppression of scientists' data, findings and opinions, and denial of peer review, by the IPCC, which was discovered when the HadCRU server was hacked twice, in 2009 and in 2011. Now in 2013 the draft of the IPCC Fifth Assessment Report has also been leaked, and it shows that they are going to revise their game-plan, laying more stress on solar forcing of temperature rise and less on climatic disturbances.
7. Justification of the Proposed Effects of Global Warming According to the IPCC:
The IPCC attributes many current and long-standing effects of climate change to global warming; sea level change, melting of ice and retreat of glaciers, and climatic disturbances are the main ones. Let us see whether these are correct or not.
Sea level change: The IPCC tries to alarm the public with threatening images of melting glaciers, huge chunks of ice breaking off the Antarctic and Greenland ice shelves, and rising ocean levels. But the predicted rise in ocean levels is trivial compared to the 400 feet they have risen in the last 18,000 years, without any help from the burning of fossil fuels or any other human contribution to CO2.
Melting of ice and glaciers: Nautical records show ice shelves have been breaking off for centuries, long before the rise in atmospheric CO2 or the world became industrialized. And polar ice is not disappearing. The West Antarctic Ice Sheet lost two-thirds of its ice mass since the last ice age but is now growing. Side-looking interferometry shows it is now growing at a rate of 26 billion tons a year. How can this be when there are pictures of huge ice chunks breaking off and melting? While ice is disappearing at the perimeter, it is piling up inland. Most of the Antarctic ice is above 4,000 feet. As the ice increases there, it pushes the glaciers toward lower elevations at the edge of the continent, where they break off. The only part of Antarctica that is warming is the peninsula, which is furthest from the South Pole and comprises only 2 percent of Antarctica—but it is the part the news media focuses on when they talk about global warming in Antarctica. They never mention the other 98 percent that is getting colder, as can be seen from the measurements of the British meteorological stations there, which can easily be found on the internet.

At the top of the globe, the western Arctic is warming due to unrelated cyclical events in the Pacific Ocean while the eastern Arctic and Greenland are getting colder. According to a letter from Myron Ebell (quoted in TWTW of Feb. 3, 2007 at
http://sepp.org/), the chairman of the Arctic Climate Impact Assessment “redefined the Arctic in order to show a bigger warming trend and cut off the temperature record before 1950 so that they wouldn't have to explain why it was at least as warm in the 1930s as today in the Arctic (the reason claimed is a hoot: there weren't enough weather stations before 1950—even though there were more than in recent decades).” The recent proposal to list the polar bear as “threatened” mentions areas of open water in the Arctic that were frozen solid 30 years ago. But these same areas were reported as open water by explorers in the early 20th century. These areas subsequently froze during several decades and have now merely returned to their previous condition. The Greenland ice mass has thickened by seven feet since it was first measured by laser altimetry in 1980 and continues to grow.
What about the glaciers that are melting? For some glaciers around the world, historical records exist of their lengths over centuries. An intriguing study is “Extracting a Climate Signal from 169 Glacier Records” by J. Oerlemans (
Science, April 29, 2005). As you can see in the chart from his work, glaciers have been receding since 1750, with the trend accelerating after about 1820. Henry Ford began assembly-line production in 1913, but by then half of the glacier loss from 1800 to 2000 had already occurred. And 70 percent of the glacier shortening occurred before 1940, that is, before worldwide industrialization and the increase in atmospheric CO2 that we are told is melting the glaciers to today's level (fig. 28).
                               Fig. 28 - Length of glaciers through time
Though glaciers have been receding worldwide, they have not retreated back to their locations in the Medieval Warm Period. The Aletsch and Grindelwald glaciers (Switzerland) were much smaller between 800 and 1000 AD than today; the latter glacier is still larger than it was in 1588 and earlier years. In Iceland today, the Drangajokull and Vatnajokull glaciers are far more extensive than in the Middle Ages, and farms remain buried beneath their ice. So much for the so-called “unprecedented” global warming, the “greatest threat to mankind,” and a disappearance of glaciers that is “the worst in thousands of years” because of increases in carbon dioxide in recent decades.
Climatic disturbances: The violent weather predicted by the IPCC is not getting worse. Climate alarmists claim that global warming may increase severe weather events, but there is no evidence of increasing severe storm events in real-world data. The Accumulated Cyclone Energy (ACE) index combines a storm's intensity and longevity. Global hurricane activity has continued to sink to levels not seen since the 1970s. During the past 60 years, Northern Hemisphere ACE has undergone significant inter-annual variability but exhibits no significant statistical trend. The Northern Hemisphere ACE for 2008 was 66% of the 2005 value, as shown in the stacked bar chart.
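The ACE index mentioned above is simple to compute: by the standard definition it is the sum of the squares of a storm's 6-hourly maximum sustained winds (in knots) while the storm is at tropical-storm strength (35 kt or more), scaled by 10^-4. A minimal sketch with invented wind readings:

```python
def ace(wind_knots, threshold=35):
    """Accumulated Cyclone Energy: sum of squared 6-hourly maximum
    sustained winds (knots) at or above tropical-storm strength,
    scaled by 1e-4. One number combining intensity and longevity."""
    return sum(v ** 2 for v in wind_knots if v >= threshold) * 1e-4

# Invented 6-hourly wind observations (knots) over one storm's lifetime;
# readings below 35 kt contribute nothing.
storm = [25, 35, 45, 60, 80, 65, 40, 30]
print(ace(storm))
```

A seasonal or hemispheric ACE is just this quantity summed over all storms, which is why a season with many long-lived, intense storms scores high even if no single storm is record-breaking.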
                            Northern Hemisphere Hurricane Activity (ACE)

                                                Fig. 29

Causes of global warming: 1. CO2, especially the anthropogenic CO2 among the greenhouse gases (GHG), is mainly held responsible for global warming. Yet CO2 comprises only 0.035 percent of our atmosphere and is a very weak greenhouse gas. Although it is widely blamed for greenhouse warming, it is not
the only greenhouse gas, or even the most important one. Water vapor is by far the most important greenhouse gas, accounting for 97 or 98 percent of any greenhouse effect; the remainder is due to carbon dioxide, methane and several other gases. Furthermore, of the tiny percentage that CO2 contributes to the greenhouse effect, 97 percent is due to nature, not man. Termites, for example, produce CO2 emissions many times those of all the factories and automobiles in the world (Science, Nov. 5, 1982). Combining the factors of water vapor and nature's production of CO2, we see that 99.9 percent of any greenhouse effect has nothing to do with carbon dioxide emissions from human activity. So how much effect could regulating the tiny remainder have upon world climate? Keep in mind, too, that: (1) anthropogenic emissions of CO2 are only one percent of the atmospheric reservoir of CO2; (2) they are an even smaller percentage of the reservoir of 40,000 billion tons of carbon in the oceans, dissolved as CO2 and in other forms; (3) the oceans receive large quantities of CO2 from volcanic emissions bubbling up from the ocean floors, most significantly from the Mid-Atlantic Ridge; and (4) the oceans are by far the dominant source of atmospheric CO2, with the equatorial Pacific alone contributing 72 percent of atmospheric CO2. Is it credible, then, that these vast global processes of nature can be altered by mankind's puny emissions of CO2? The global processes are so colossal as to overwhelm any human contribution. Furthermore, as Dr. Arthur B. Robinson has explained, “The turnover rate of carbon dioxide as measured by carbon 14 is too short to support a human cause [for a rise in CO2].” And Antarctic ice cores show that increases in carbon dioxide follow increases in temperature, not the other way around. One cannot have a cause-and-effect relationship in which the effect precedes the cause, yet that is exactly the case in this relationship (fig. 33).

The earth's temperature has risen about 1 degree F. in the past century. This is not “global warming” but normal fluctuation. The climate is always changing, and one would be hard pressed to find a century when the change did not amount to a degree or more in either direction. But temperature changes within the past century do not correlate with CO2 emissions. Most of the one degree temperature rise of the past century occurred before 1940, while 82 percent of the CO2 entered the atmosphere after 1940. From 1940 until 1975, carbon dioxide was strongly increasing but global temperatures cooled, leading to countless scare stories in the media about a new ice age commencing.
 The following graph shows the temperature changes of the lower troposphere from the surface up to about 8 km, as determined from the average of two analyses of satellite data. The UAH analysis is from the University of Alabama in Huntsville and the RSS analysis is from Remote Sensing Systems; the two analyses use different methods to adjust for factors such as orbital decay and inter-satellite differences. The best-fit line from January 2002 indicates a declining trend. Surface temperature data is contaminated by the effects of urban development. The Sun's activity, which was increasing through most of the 20th century, has recently become quiet, causing a change of trend. The rippled line shows the CO2 concentration in the atmosphere, as measured at Mauna Loa, Hawaii. The ripple in the CO2 curve is due to seasonal changes in biomass: a far greater land area is affected by the seasons in the Northern Hemisphere than in the Southern, and during the Northern Hemisphere summer the large uptake of CO2 by growing plants causes a drop in the atmospheric CO2 concentration. Cool periods in 1984 and 1992 were caused by the El Chichon and Pinatubo volcanic eruptions; the temperature spike in 1998 was caused by a strong El Nino. Natural climate change is much stronger than any effect from carbon dioxide.

                                                              Fig. 30

February 2010 satellite data - global warming is not global:
February 2010 was reported to have the warmest global-average February anomaly since satellite data began in 1979, as shown in the following figure from http://www.drroyspencer.com/latest-global-temperatures/.

                                                              Fig. 31
As shown in the following figure, most of the world had normal (white and lighter shading) or below-normal (darker shading) temperatures. The global average is affected by one small area of the Arctic having much higher than normal temperatures.
                                              Fig. 32
Correlations with the broader historical record are even more out of whack. During the Late Ordovician Period of the Paleozoic Era, CO2 levels were 12 times higher than today. According to greenhouse theory, the earth should have been really hot; instead it was in an ice age! And now we are told that the planet will overheat if CO2 doubles. Carbon dioxide is a weak greenhouse gas. Computer simulations predicting environmental catastrophe depend on the small amount of warming from CO2 being amplified by increased evaporation of water, water vapor being a strong greenhouse gas. But in the many documented periods of higher CO2, even during much warmer periods, no such catastrophic amplification occurred. So there is no reason to fear the computer predictions, or to base public policies on them. They are clearly wrong.
 The Antarctic Vostok ice core records, often cited as evidence that CO2 causes climate change, in fact show cause and effect reversed: the record shows that the CO2 increase lagged the warming by about 800 years. Temperature increases cause the oceans to expel CO2, increasing the CO2 content of the atmosphere.
        
     Fig. 33 - The ice core data proves that CO2 is not a primary climate driver.
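Whatever interpretation one prefers, the lag itself is a measurable quantity. A minimal sketch (entirely synthetic series, not actual Vostok data) of estimating the shift at which two records correlate best:

```python
# Estimate the lag (in samples) at which series b best matches series a,
# by sliding b backwards and picking the shift with the highest correlation.
# Synthetic data: b is constructed as a delayed copy of a plus a constant.

def best_lag(a, b, max_lag):
    def corr(x, y):
        n = len(x)
        mx, my = sum(x) / n, sum(y) / n
        cov = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
        vx = sum((xi - mx) ** 2 for xi in x)
        vy = sum((yi - my) ** 2 for yi in y)
        return cov / (vx * vy) ** 0.5
    scores = {}
    for lag in range(1, max_lag + 1):
        # positive lag means b lags a by `lag` samples
        scores[lag] = corr(a[:-lag], b[lag:])
    return max(scores, key=scores.get)

a = [0, 1, 3, 2, 5, 4, 6, 8, 7, 9, 10, 12, 11, 13, 14]
b = [x + 2 for x in ([0, 0, 0] + a[:-3])]   # a delayed by 3 samples
print(best_lag(a, b, 6))  # recovers the 3-sample delay
```

With real ice-core series, the records would first need to be put on a common time grid; the roughly 800-year lag cited above corresponds to the shift that maximizes the match between the two records.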
Since the greenhouse gas theory cannot explain global temperature changes, what does? The sun, along with cosmic rays from beyond our solar system, also contributes. Everyone knows the sun heats the earth, but that heat is not uniform. “Sunspot” cycles vary solar intensity, and these correlate extremely well with shorter-term cycles in global temperatures. Dr. Baliunas of the Harvard-Smithsonian Center for Astrophysics has extended the correlation back another hundred years by using the sun's magnetic cycle as a proxy for its brightness (irradiance). Longer and more severe swings in global climate, such as the ice ages, correlate with changes in the earth's orbit around the sun. Clouds have a hundred times stronger effect on climate than does carbon dioxide. A one percent increase in cloud cover would offset a doubling of atmospheric CO2; yet from 1988 to 1990, cloud cover increased by 3 percent. And what determines cloud cover? The sun, through variations in cosmic rays and the solar wind. In the words of Dr. Theodor Landscheidt of Canada's Schroeder Institute, “When the solar wind is strong and cosmic rays are weak, the global cloud cover shrinks. It extends when cosmic rays are strong because the solar wind is weak. This effect [is] attributed to cloud seeding by ionized secondary particles.”


                                                                                                                 
                                                                Fig. 34
Mars is undergoing global warming. Clearly this cannot be explained by the popular CO2 answer. What could possibly be causing Martian warming if not the sun? And if that's what is happening on Mars, why not on earth? After all, we share the same sun. Why is there such stubborn adherence to the CO2 hypothesis despite its failures? In June 2006 the journal Nature published three separate papers on an expedition that extracted sediment samples just 50 miles from the North Pole. Based in part on specimens of algae that indicate subtropical or tropical conditions, the scientists determined that 55 million years ago the Arctic Ocean had a balmy Florida-like year-round average temperature of 74 degrees F. Warming of this magnitude could not have been produced by carbon dioxide, but scientists cling tenaciously to the popular but failed explanation. An article in the New York Times describes this Arctic discovery as offering insight into “the power of greenhouse gases to warm the earth.” It quotes several scientists as saying they still support the idea of greenhouse gases determining the planet's warming or cooling, even though they admit they don't understand what happened here.

8. Politics behind global warming: Why there is a reluctance to admit the sun is responsible for changes in global climate when there is such strong evidence is not known. Why is there such emphasis on CO2 when the human contribution of it is trivial and water vapor is so much more important in greenhouse effect? Same answer to both questions: governments can only control people, not nature. If the sun is responsible for climate change, then there is nothing governments can do about it. If water vapor is the key to greenhouse warming, then there is nothing governments can do about it. For government to be relevant on this issue, it must have a cause that can be blamed on people, because people are the only thing government can control. And if government is not relevant on this issue, then there is no need for those political appointees from 150 nations to the IPCC. Nor is there a need for all the government grants to all the scientists and institutions for studies that keep trying to prove that increases in CO2 are causing global warming, in order to validate government intervention. Nor is there a justification for spending other people's money (taxpayer funds) for such purposes. Nor is there a need for the bureaucrats and governmental framework to study, formulate and implement regulations for controlling CO2 emissions, for extending the role of government over every aspect of people's lives. H.L. Mencken once said, “The urge to save humanity is almost always a false front for the urge to rule it.”
There is world intergovernmental (international?) politics behind global warming and climate change. It is an effect of the cold war waged against the socialist world by the capitalist and imperialist world. It started as an action against the working class of England and was then extended into a general attack on the working people and toiling masses of the world. The inception of the idea is very interesting. In the United States, the mass media devoted little coverage to global warming until the drought of 1988 and James E. Hansen's testimony to the Senate, which explicitly attributed "the abnormally hot weather plaguing our nation" to global warming. The British press also changed its coverage at the end of 1988, following a speech by Margaret Thatcher to the Royal Society advocating action against human-induced climate change. According to Anabela Carvalho, an academic analyst, Thatcher's "appropriation" of the risks of climate change to promote nuclear power, in the context of the dismantling of the coal industry following the 1984-1985 miners' strike, was one reason for the change in public discourse. At the same time environmental organizations and the political opposition were demanding "solutions that contrasted with the government's". Many European countries took action to reduce greenhouse gas emissions before 1990. West Germany started to take action after the Green Party took seats in Parliament in the 1980s. All countries of the European Union ratified the 1997 Kyoto Protocol. Substantial activity by NGOs took place as well. Both "global warming" and the more politically neutral "climate change" were listed by the Global Language Monitor as political buzzwords or catchphrases in 2005. In Europe, the notion of human influence on climate gained wide acceptance more rapidly than in the United States and other countries.
A 2009 survey found that Europeans rated climate change as the second most serious problem facing the world, between "poverty, the lack of food and drinking water" and "a major global economic downturn". Eighty-seven per cent of Europeans considered climate change to be a very serious or serious problem, while ten per cent did not consider it a serious problem.
The United States supports the global warming thesis but refuses to accept the Kyoto Protocol and carbon budgeting, apprehending a great loss to US industrial development were it to accept the protocol. Government and non-government agencies of the US, such as GISS, work with the IPCC; they also help and back NGOs with funding, promote the executive and judicial environmental-protection policies of governments in different countries of the world, and support the work of the UNO in such activities. The EEU, England and the UNO all take the side of global warming, pressing to turn world politics toward environmental protection and to make it the main political agenda amidst the long-standing and ever-deepening world economic recession and depression under which the world's toiling masses suffer greatly. The countries of the world also take part because of GATT and the emergence of MNCs and TNCs, attempting to overcome their own deepening economic crises not by solving them but by diverting the issue of development to the environment in the name of the safety of mankind. Citizen bodies, scientists and environment-related NGOs follow the same line, adopting policies based on environmental determinism and environmentalism. But it is to be remembered that environmentalism is an “-----------------anti-scientific reactionary trend-----------. The reason for its tenacity of life is the tendency of bourgeois authors to appeal to geographical environment or biological laws for proof of the suitability or unsuitability (in accordance with the interests of the bourgeoisie at a given moment) of historically complex social relations” - Bryeterman.
Author's note and acknowledgements: Most of the information, write-ups, graphs, citations, etc. have been taken directly from various websites of the IPCC, GISS, GHCN, etc. and from scientists' writings, and are added directly or indirectly. The author apologises for this use and acknowledges the many websites, authors, scientists, etc. that could not be mentioned properly here owing to constraints of space.
                                                ------------------------------------------

Proposed policy responses to global warming include mitigation by emissions reduction, adaptation to its effects, and possible future geo-engineering. Most countries are parties to the United Nations Framework Convention on Climate Change (UNFCCC), whose ultimate objective is to prevent dangerous anthropogenic (i.e., human-induced) climate change. Parties to the UNFCCC have adopted a range of policies designed to reduce greenhouse gas emissions and to assist in adaptation to global warming, and have agreed that deep cuts in emissions are required and that future global warming should be limited to below 2.0 °C (3.6 °F) relative to the pre-industrial level. Reports published in 2011 by the United Nations Environment Programme and the International Energy Agency suggest that efforts of the early 21st century to reduce emission levels may be inadequate to meet the UNFCCC's 2 °C target.
GLOBAL TEMPERATURE DATA – SOURCE AND METHODS:
 The main global surface temperature data sets are managed by the US National Oceanic and Atmospheric Administration (NOAA) at the National Climatic Data Center (NCDC); this is the Global Historical Climatology Network (GHCN). Then there are the surface temperature data sets of NASA-GISS and the Hadley HadCRUT, all using the same raw data but with different adjustment methods. Satellite data sets from 1979 onwards are available from RSS and UAH, which use a common satellite data stream but different processing techniques. The period of record of the earth's surface temperature varies from station to station, with several thousand stations extending back to 1950 and several hundred being updated monthly. This is the main source of data for global studies. However, the measurement of a “global” temperature is not as simple as it may seem. Historical instrumentally recorded temperatures exist only for 100 to 150 years, and only in small areas of the world. During the 1950s to 1980s temperatures were measured in many more locations, but many stations are no longer active in the database. Satellite measurements of atmospheric temperature began in 1979.

     Fig. 1 - Increase of global temperature and sea level and decrease of Northern Hemisphere snow cover (Source: SPM of IPCC).
Now let us see the methods used to derive the earth's temperature data and its rising or falling trend. As per the IPCC, average surface air temperatures at a given station are calculated by the following procedure: record the minimum and maximum temperature for each day; calculate the average of the minimum and maximum; calculate the monthly averages from the daily data; and calculate the annual averages by averaging the monthly data. Various adjustments are also made, so in practice it is not so simple. The IPCC uses data processed and adjusted by the UK-based Hadley Climatic Research Unit of the University of East Anglia (HadCRU), although much of the HadCRU data comes from the GHCN (Global Historical Climatology Network) of the US National Oceanic and Atmospheric Administration (NOAA) at the National Climatic Data Center (NCDC) and from NASA-GISS (Goddard Institute for Space Studies). The UK-based HadCRU provides the following description:
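The procedure just described (daily mean from the minimum and maximum, then monthly and annual averages) can be sketched directly, leaving the adjustments aside; all temperature values below are invented for illustration:

```python
# Station averaging as described: daily mean = (min + max) / 2,
# monthly mean = average of daily means, annual mean = average of
# monthly means. Temperatures are invented illustration values in deg C.

def daily_mean(tmin, tmax):
    return (tmin + tmax) / 2

def monthly_mean(days):           # days: list of (tmin, tmax) pairs
    means = [daily_mean(lo, hi) for lo, hi in days]
    return sum(means) / len(means)

def annual_mean(months):          # months: list of monthly day-lists
    means = [monthly_mean(m) for m in months]
    return sum(means) / len(means)

january = [(2.0, 8.0), (1.0, 7.0), (3.0, 9.0)]     # three sample days
july = [(15.0, 27.0), (16.0, 30.0), (14.0, 26.0)]
print(monthly_mean(january))   # 5.0
print(annual_mean([january, july]))
```

Note that (min + max) / 2 is itself only an approximation to the true daily mean temperature, before any of the further adjustments are considered.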
 “Over land regions of the world, over 3000 monthly station temperature time series are used. Coverage is denser over the more populated parts of the world, particularly the United States, southern Canada, Europe and Japan. Coverage is sparsest over the interior of the South American and African continents and over the Antarctic. The number of available stations was small during the 1850s, but increases to over 3000 stations during the 1951-90 period. For marine regions, sea surface temperature (SST) measurements taken on board merchant and some naval vessels are used. As the majority come from the voluntary observing fleet, coverage is reduced away from the main shipping lanes and is minimal over the Southern Oceans.”
 Stations on land are at different elevations, and different countries estimate average monthly temperatures using different methods and formulae. To avoid biases that could result from these problems, monthly average temperatures are reduced to anomalies from the period with best coverage (1961-90). For a station to be used, an estimate of its base-period average must be calculated. Because many stations do not have complete records for the 1961-90 period, several methods have been developed to estimate 1961-90 averages from neighbouring records or from other sources of data. Over the oceans, where observations are generally made from mobile platforms, it is impossible to assemble long series of actual temperatures for fixed points. However, it is possible to interpolate historical data to create spatially complete reference climatologies (averages for 1961-90) so that individual observations can be compared with a local normal for the given day of the year.
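Reducing a station record to anomalies relative to a base-period average, as described, amounts to subtracting one number from every value (the values here are invented; the base period in the text is 1961-90):

```python
# Convert absolute monthly/annual temperatures to anomalies relative to
# the mean over a base period. Invented illustration values in deg C.

def anomalies(series, base_years):
    """series: {year: temp_deg_c}; base_years: years defining the base mean."""
    base_mean = sum(series[y] for y in base_years) / len(base_years)
    return {y: t - base_mean for y, t in series.items()}

temps = {1961: 14.0, 1962: 14.2, 1963: 13.8, 2000: 14.6}
print(round(anomalies(temps, [1961, 1962, 1963])[2000], 3))  # 0.6
```

Because every value has the same base mean subtracted, stations at different elevations or in differently calibrated national networks can be compared on a common footing, which is exactly why the anomaly step exists.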
 It is important to note that the HadCRU station data used by the IPCC are not publicly available - neither the raw data nor the adjusted data; only the adjusted gridded data (i.e. after adjustments are made and station anomalies are averaged for each 5x5 degree grid box) are available. Station temperature anomalies within each 5x5 degree grid box are averaged for each year, after some adjustments and weighting are applied in deriving the station anomalies, to give the grid box's yearly value. All the grid anomalies of a hemisphere for each year are then averaged to give a hemispheric yearly anomaly, and the two hemispheric values are averaged to give a single global average annual temperature anomaly. Plotting these global annual temperature anomalies on a graph brings out the temperature trend.
 Different agencies use different methods for calculating a global average after adjustments have been made to the temperatures. As already mentioned, in the HadCRU method (used by the IPCC), anomalies are calculated relative to the average observed in the 1961-90 period (thus stations without data for that period cannot be included). For the calculation of global averages, the HadCRU method divides the world into a series of 5 x 5 degree grids, and the temperature for each grid cell is calculated by averaging the stations in it. The number of stations varies all over the world, and many grid cells contain no stations. The two component parts (land and marine) are separately interpolated to the same 5 x 5 degree latitude/longitude grid boxes. Land temperature anomalies are in-filled where more than four of the surrounding eight 5 x 5 degree grid boxes are present. Weighting methods can vary, but a common one is to average the grid-box temperature anomalies, weighted according to the area of each 5 x 5 degree grid cell, into hemispheric values; the hemispheric averages are then averaged to create the global-average temperature anomaly. The IPCC deviates from the HadCRU method at this point - instead the IPCC uses “optimal averaging. This technique uses information on how temperatures at each location co-vary, to a given time.” Thus empty grid cells are interpolated from surrounding cells. Another method is to calculate averages by averaging the cells within latitude bands and then averaging all the latitude bands.
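The area weighting mentioned above can be sketched by taking a 5x5-degree cell's weight to be proportional to the cosine of its centre latitude (a common simplification, since cells shrink toward the poles; the anomaly values are invented):

```python
# Area-weighted average of gridded anomalies: each 5x5-degree cell is
# weighted roughly by cos(latitude of its centre), since cells cover
# less area at high latitude. Anomalies are invented values in deg C.
from math import cos, radians

def weighted_average(cells):
    """cells: list of (centre_latitude_deg, anomaly_deg_c) pairs."""
    weights = [cos(radians(lat)) for lat, _ in cells]
    total = sum(w * anom for w, (_, anom) in zip(weights, cells))
    return total / sum(weights)

# One near-equatorial and one high-latitude cell with different anomalies.
# The unweighted mean would be 1.0; area weighting pulls the result
# toward the (larger) equatorial cell.
cells = [(2.5, 0.2), (72.5, 1.8)]
print(weighted_average(cells))
```

This is why a strong anomaly in a small high-latitude cell moves the hemispheric average less than the same anomaly near the equator, a point directly relevant to the Arctic examples discussed earlier.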
 In GISS, “A grid of 8000 grid boxes of equal area is used. Time series are changed to series of anomalies. For each grid box, the stations within that grid box and also any station within 1200 km of the centre of that box are combined using the ‘reference station method’. A similar method is also used to find a series of anomalies for 80 regions consisting of 100 boxes from the series for those boxes, and again to find the series for 6 latitudinal zones from those regional series, and finally to find the hemispheric and global series from the zonal series.” These are the methods of deriving the global temperature anomaly series; plotting them on a graph shows the rising or falling trends.
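The 1200 km rule of the GISS reference station method can be illustrated with a simplified sketch. The linear distance weighting below follows the general description quoted above, but the station coordinates and anomalies are invented, and the real method also aligns the overlapping periods of the combined records, which this sketch omits.

```python
from math import radians, sin, cos, asin, sqrt

def great_circle_km(lat1, lon1, lat2, lon2):
    """Haversine great-circle distance in km between two points."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    h = sin((lat2 - lat1) / 2) ** 2 + \
        cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371.0 * asin(sqrt(h))

def box_anomaly(box_lat, box_lon, stations, radius_km=1200.0):
    """Distance-weighted mean of station anomalies within radius of the
    box centre; weight falls linearly from 1 at the centre to 0 at the
    radius. Returns None if no station lies within the radius."""
    num = den = 0.0
    for lat, lon, anom in stations:
        d = great_circle_km(box_lat, box_lon, lat, lon)
        if d < radius_km:
            w = 1.0 - d / radius_km
            num += w * anom
            den += w
    return num / den if den else None

# Invented Siberian-like example: a box centre and two nearby stations.
stations = [(62.0, 120.0, 0.4), (70.0, 110.0, -0.1)]
print(box_anomaly(67.5, 117.5, stations))
```

Note how a box with no station of its own still receives a value, as long as any station lies within 1200 km; this is the mechanism behind the "artificial expansion of data" criticized later in the paper.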
DEFECTS IN METHODS OF DETERMINING GLOBAL WARMING:
 There are some genuinely untenable theoretical and technical grounds behind the temperature-determining methods applied by the IPCC, HadCRU and GISS. The earth's average surface temperature is not a simple measurement but a mathematical construction: no single temperature of the earth as a whole is recorded from the sky, from space or from the high atmosphere. Satellite recording of the earth's temperature began in 1979, and in the satellite record no clear sign of rising temperature or global warming is observed. So, depending only on an average surface temperature determined by mathematical calculation, subject to a wide range of theoretical and technical limitations, how can we unequivocally accept the IPCC's conclusion on global warming? The limitations are many: data availability; the required quantum of data; geographical, climatic and seasonal variability; methods of averaging and weighting; grid construction and area concepts; the locations at which data are acquired; proxies used in the absence of data; adjustment of data on undisclosed bases; misrepresentation and misinterpretation of historical data; cartographic misrepresentation and questions of scale; data manipulation, mismanagement and suppression; the limits of theoretical and computer models of forecasting; the limits of interpretation and explanation; and instrumental limitations. All of these raise the question of the tenability and unequivocal acceptability of the global warming theory. Science does not settle questions by counting heads or by voting. Let us discuss these points categorically in the following.
1. Data availability is a major defect of the theory. Temperature recording stations of GISS are not evenly distributed over the world. The US has the largest and densest network of stations, and the world areas between 30°N and 60°N have the highest coverage (more than 69% of all stations, about half of them in the US). Other areas have sparse numbers of stations except for some patches, and the vast oceans are mostly devoid of recording stations (fig. 2). So how can the temperature measurement be called global? A temperature trend for the US, if warming is granted, will be more reliable than one for other parts of the globe or for the globe as a whole. The following table-1 shows the distribution of stations by latitude band.
                                                            Table-1: Distribution of stations by latitude band
In addition to the sparseness and uneven distribution of stations (fig.2), the number of stations also changes from time to time. The land coverage slowly increased from 10% in the 1880s to 40% in the 1960s, and has been decreasing continuously in recent years (fig. 3).
                                                      
 Fig-2 World distribution of temperature recording stations of GISS (NASA)                
Fig-3 Yearly number of stations (panel c shows the percentage of hemispheric area within 1200 km of a station, an area equal to almost twice that of a 5x5 degree HadCRU grid box). Source: NASA-GISS [http://data.giss.nasa.gov/gistemp/station_data/].
 The following figure shows the variation in the number of stations (a) in the GHCN from 1850 to 1997 and the variation in the global coverage of the stations, as defined by 5 x 5 degree grid boxes (b). There was a major disappearance of recording stations in the late 1980s and early 1990s.
            
                                   Fig.4- Existence of recording stations in time-scale
The following fig.5 compares the number of global stations in 1900, the 1970s and 1997, showing the increase and subsequent decrease. The 1997 panel shows the coverage of stations usable as of 1997; i.e. the blank areas have no coverage in recent times.
                                           
                                                                              
                               
            Fig.5- Comparison of Available GHCN Temperature Stations over Time.
 The University of Delaware has an animated movie of station locations over time. In addition, many stations have moved location, and some stopped collecting data during periods of war (or, for example, during the Cultural Revolution in China), leading to a major problem of discontinuities.
The following figure shows the number of stations in the GHCN database with data for selected years, distinguishing stations in the United States (light) from the rest of the world (ROW - dark). The percentages indicate the share of all stations that are in the US.
       
                             





Fig.6- Comparison of Number of GHCN Temperature Stations in the US versus Rest of the World.
 The following fig.7 shows a calculation of straight temperature averages for all reporting stations from 1950 to 2000. While a straight average is not meaningful for a global temperature calculation (since areas with more stations would receive higher weighting), it illustrates that the disappearance of so many stations may have introduced an upward temperature bias. As can be seen in the figure, the straight average of all global stations does not fluctuate much until 1990, at which point the average temperature jumps up. This observational bias can influence the calculation of area-weighted averages to some extent. A study by Willmott, Robeson and Feddema ("Influence of Spatially Variable Instrument Networks on Climatic Averages," Geophysical Research Letters, vol. 18, no. 12, pp. 2249-2251, Dec 1991) calculated a +0.2°C bias in the global average due to pre-1990 station closures.
                       
Fig.7-Calculation of Average Temperatures from Reporting Stations for 1950 to 2000.
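The upward bias that station closures can introduce into a straight average is easy to demonstrate with synthetic numbers. The station values below are invented; the point is only that the unweighted mean jumps when cooler stations stop reporting, with no change at any individual station.

```python
# Illustration (with invented numbers) of the station-dropout bias
# discussed above: the straight average rises when cold stations close,
# even though no individual station warms.

def straight_average(readings):
    """Unweighted mean over whichever stations reported that year."""
    return sum(readings) / len(readings)

warm_stations = [15.0, 14.5]   # e.g. low-latitude stations (invented)
cold_stations = [-5.0, -6.0]   # e.g. high-latitude stations (invented)

before = straight_average(warm_stations + cold_stations)  # all reporting
after = straight_average(warm_stations)                   # cold ones closed
print(before, after)  # 4.625 14.75 -- apparent "warming" from closures alone
```

The same mechanism operates more weakly in area-weighted schemes whenever the closures are geographically clustered, which is the concern raised by the Willmott, Robeson and Feddema study cited above.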
 The cartographic scales used also give wrong impressions, and some misrepresentation is involved. Vertical scales are often much greater than horizontal scales, which exaggerates the slightest variation or change. When the horizontal and vertical scales are the same or comparable, the impression given is nearly correct. In some cases, dissimilar vertical scales are used in graphs meant for comparison, which defeats the purpose of the comparison. These effects are well known, so their use appears intentional.
Temperature measurement stations must be continually re-evaluated for suitability for inclusion, owing to changes in the local environment such as increased urbanization, which raises local temperatures regardless of external environmental influences. Strictly, only rural stations can validly be used in calculating temperature trends, but most of our stations are urban. As a result, adjustments are made to the temperature data of urban stations, as discussed in later paragraphs.
 There is substantial debate in the scientific community regarding the use of various specific stations, as well as the factors affecting the uncertainty of the measurements. For example, the paper by Pielke et al. (Feb 2007), with twelve authors, describes temperature-measurement uncertainties that have not been taken sufficiently into consideration.
 The Surface Stations web site is accumulating physical site data for the temperature measurement stations (including photographs) and identifying problem stations: a significant number of stations have improper site characteristics, especially in urban areas.
2. Geographical, altitudinal, latitudinal, climatic and seasonal variations become meaningless, and in some cases immaterial, when only one generalised yearly average surface temperature or anomaly is considered for the whole earth. Is this justifiable? The earth is a sphere warmed by sunlight from one side only, with the sun nearly perpendicular over the low latitudes, so there are strict latitudinal variations in temperature. An average within a latitude band is meaningful and may be allowed, but averaging across latitude bands with widely differing temperatures is meaningless and can never be allowed. There are also altitudinal variations: high altitudes are cold just as high latitudes are cold, and these cannot be averaged or generalised except in very limited local cases. Contrasting climates, such as dry versus humid, desert versus polar or alpine, and continental versus coastal or oceanic, likewise cannot be bound together into one generalised average temperature. Nor can the seasonal variability of a station's temperature be compressed into a single yearly average without losing all meaning. Spatial factors and the yearly seasonal cycle produce such a wide range of variation that a single generalised global value is meaningless and unscientific.
3. Limitations in the applied temperature-determining methods are another drawback of the theory. NASA-GISS processes the GHCN data through a series of adjustments to calculate global temperatures, using a different method from HadCRU. The following figures compare the GISS 2005 temperature anomalies with the HadCRUT data. GISS uses a 1200 km smoothing, which artificially extends the data into areas without data; thus the Arctic shows more (artificial) warming in the GISS data than in the HadCRUT. Grey areas are areas without data.
Fig.8-  Artificial warming of Arctic area due to adjustment by proxy by GISS (smoothing) and HadCRU (?).
 4. Another main drawback is the complete absence of recording stations, and hence of data, in a large number of the land grids of HadCRU or GISS, and the absence in most grids of time series of the required length, including the 1961-1990 base period; the oceanic regions are almost entirely without data, except for irregular and unreliable readings taken from passing ships. Interestingly, these blank areas are filled with proxy data by different agencies using different methods, as already discussed under 'global temperature data: sources and methods'. HadCRU, GISS and the IPCC each use their own technique to fill blank temperature-anomaly grids by proxy. HadCRU uses an interpolation method, filling a grid from at least four of the eight surrounding grid values where available, separately for the land and ocean components. The IPCC uses an 'optimal averaging' method, which uses information on how the temperature at each location co-varies with others to interpolate the data. GISS uses the 'reference station method', which combines the stations within a grid with stations up to 1200 km from the centre of that grid. Thus, ironically, warming is made 'unequivocally' acceptable to all by filling blank grid boxes with theoretical proxies on a large scale.
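The HadCRU-style in-fill rule ("more than four of the surrounding eight grid boxes") can be sketched as follows. The grid values are invented, and real in-filling may weight the neighbours rather than simply averaging them, so this is only an illustration of the rule.

```python
# Sketch of the neighbour-based in-fill rule described above: an empty
# cell is interpolated only when more than four of its eight neighbours
# have values. Grid layout and anomaly values are invented.

def infill(grid, i, j):
    """Return an interpolated anomaly for empty cell (i, j), or None."""
    neighbours = []
    for di in (-1, 0, 1):
        for dj in (-1, 0, 1):
            if (di, dj) == (0, 0):
                continue
            ni, nj = i + di, j + dj
            if 0 <= ni < len(grid) and 0 <= nj < len(grid[0]):
                v = grid[ni][nj]
                if v is not None:
                    neighbours.append(v)
    if len(neighbours) > 4:          # "more than four of the surrounding eight"
        return sum(neighbours) / len(neighbours)
    return None                      # too sparse: the cell stays empty

grid = [[0.2, 0.4, 0.1],
        [0.3, None, 0.5],
        [None, 0.2, None]]
print(infill(grid, 1, 1))  # six neighbours have data -> in-filled (~0.283)
print(infill(grid, 2, 2))  # only two neighbours -> None, stays empty
```

Once in-filled, such a cell enters the hemispheric average on the same footing as a cell with real stations, which is the objection raised in this point.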
5. Adjustment of data is another drawback of the global warming case, and the technique has not been disclosed by the IPCC to date. Adjustments are made in many fields. GISS's 1200 km smoothing is an adjustment to cover areas with no data, as mentioned in point 4 above, and it creates artificial warming in regions such as the Arctic. Adjustments are also made to the historical temperature data-set on an unknown basis. The adjustments applied to GHCN data include area edits (then called Raw), time of observation (TOBS), equipment changes (MMTS), station history adjustment (SHAP), filling of missing data (FILNET) and urban warming adjustment (Urban). But the adjustments stand on no solid logical ground. The following examples, published at www.appinsys.com/GlobalWarming, give an idea of the secrecy of the adjustments.
             Fig.9- The Northern Hemisphere average temperature  from National Geographic – November, 1976.      Fig.10 Northern Hemisphere average temperature from the Met Office Hadley Centre.
   
        Fig.11- Combination (superimposition) of the above two figures. A warming revision occurred in 1958 in the HadCRU study.
The temperature data recorded at the stations are not used directly in the averaging calculations: they are first adjusted, and different agencies use different adjustment methods. Station data are adjusted, for example, for homogeneity (nearby stations are compared, and adjustments are made if their trends differ). As an illustration of the sometimes questionable effects of temperature adjustments, consider the United States data (almost 30 percent of the world's total historical climate stations are in the US, rising to 50 percent of the world's stations for the post-1990 period). The following graphs show the historical US data from the GISS database as published in 1999 and 2001. The graph on the left was produced in 1999 (Hansen et al. 1999) and the graph on the right in 2000 (Hansen et al. 2001) [http://pubs.giss.nasa.gov/docs/2001/2001_Hansen_etal.pdf]. They are from the same raw data; the only difference is that the adjustment method was changed by NASA in 2000.
Fig.12-U.S. Temperature Changes Due to Change in Adjustment Methods (Left: 1999, Right 2001)
 The following figure compares the above two graphs, showing how an increase in the temperature trend was achieved simply by changing the method of adjusting the data. Some of the major changes are highlighted in the figure: the decreases in the 1930s and the increases in the 1980s and 1990s.
Fig. 13 -Comparison of U.S. Temperature Changes Due to Change in Adjustment Methods. Source: http://www.appinsys.com/globalwarming/Hansen_GlobalTemp.htm (see for more information on Hansen's data manipulations).
 There are many examples showing that data adjustment is made only to produce a rising temperature trend in a particular grid, against a flat or declining real trend. It is treated as an a priori assumption that a flat or declining trend is wrong and must be adjusted into a 'correct' rising trend. It is also observed that most recording stations are located in or near urban centres, which always show a rising trend; these are adjusted to resemble rural stations, especially in the Northern Hemisphere, yet the rising trend is retained. Examples may be cited as follows.
 Temperature station adjustments are theoretically supposed to make the data more realistic for identifying temperature trends. In some cases the adjustments make sense; in others they do not. Adjustments are often made to U.S. stations that make no sense but invariably increase the apparent warming. The following figure shows the closest rural station to San Francisco (Davis, left) and the closest rural station to Seattle (Snoqualmie, right). In both cases warming is artificially introduced into rural stations. (See http://www.appinsys.com/GlobalWarming/GW_Part3_UrbanHeat.htm for details on the Urban Heat Island effects evident in the surface data.)
Fig. 14- Artificial Warming Trends in Adjustments to U.S. Rural Stations (upper graphs: real data; lower graphs: after adjustment).
 Here is an example where the adjustment makes sense. In Australia the raw data for Melbourne show warming, while the nearest rural station does not. The following figure compares the raw data (lower graph) and adjusted data (upper graph) for Melbourne. However, this seems to be a rare instance.

                   
       Fig. 15- Comparison of Adjusted and Unadjusted Temperature Data for    Melbourne, Australia.
 The following graph is more typical of the standard adjustments made to the temperature data – this is for Darwin, Australia (upper – unadjusted, lower – adjusted). Warming is created in the data through the adjustments.
                
Fig. 16- Comparison of Adjusted and Unadjusted Temperature Data for Darwin, Australia

The following figures show a more recent example of the GISS re-adjustment of data (from: Bob Tisdale at http://i44.tinypic.com/29dwsj7.gif). The 2000 and 2009 versions of the GISTEMP data are compared. This shows the additional artificial warming trend created through data adjustment.
          Fig.17- The 2000 (right) and 2009 (left) versions of the GISTEMP data.
 
  Fig. 17 (continued)- Superimposition of the GISTEMP versions shows the artificial trend produced by adjustment.
Since 2000, NASA has further "cleaned" the historical record. The following graph shows the further warming adjustments made to the data in 2005. The data can be downloaded at http://data.giss.nasa.gov/gistemp/graphs/US_USHCN.2005vs1999.txt; the graph itself is from http://www.theregister.co.uk/2008/06/05/goddard_nasa_thermometer/print.html.
This figure plots the difference between the 2000 adjusted data and the 2005 adjusted data. Although the 2000 to 2005 adjustment differences are not as large as the 1999 to 2000 adjustment differences shown above, they add additional warming to the trend throughout the historical record.
              
      Fig. 18- differences between adjusted data of temperature of 2000 & 2005.
 After adjustment, the urban stations exhibit warming. The following figures compare the adjusted data (lower graphs) with the unadjusted data (upper graphs) for Wellington (left) and Christchurch (right). These adjustments introduce a warming trend into urban data that show no warming in the original.
 Fig.19- Comparison of Adjusted and Unadjusted Temperature Data for Wellington (left) and Christchurch (right).
 Even Auckland (below left) and Hokitika (right), both listed as rural, end up with very significant warming trends.
     
Fig.20- Comparison of Adjusted and Unadjusted Temperature Data for Auckland and Hokitika
 Adjustments to the data show how all of New Zealand (which exhibits no warming) ends up contributing to "global warming": the following graphs show unadjusted (left) and adjusted (right) data for Auckland, Wellington, Hokitika and Christchurch.

   
Fig. 21- Comparison of Unadjusted and Adjusted Temperature Data for New Zealand Stations
There are further problems with the IPCC (HadCRU) methods, since HadCRU/IPCC uses interpolation for 5x5 degree grids that have no stations. Siberia provides an example of the flaws involved. The following figure shows 5x5 degree grids with interpolated data as used by the IPCC, showing the temperature change from 1976 to 1999. Some Siberian grids are highlighted in the upper-right rectangular box, covering latitude 65-80 and longitude 100-135. The example also illustrates the effect of selecting a particular start year, and why the IPCC selected 1976.
                
Fig.22- IPCC Warming from 1976 to 1999 in 5x5 degree grid cells [from Figure 2.9 in the IPCC Third Assessment Report (TAR)].
 The NOAA National Climatic Data Center (NCDC) has an animated series showing the temperature anomalies for July of each year from 1880 to 1998, with no interpolation into empty grids. Source: http://www.ncdc.noaa.gov/img/climate/research/ghcn/movie_meant_latestmonth.gif. The images in the following figure are from that series. They give an indication of the global coverage of the grid temperatures and how the coverage has changed over the years, as well as highlighting two warm and two cool years. The 1930s were a very warm period (compare 1936 in b with 1998 in d below).

                    Fig.23-Temperature Anomalies for 5x5-Degree Grids for Selected Years    (GHCN data)
 In the GHCN data shown above, grid boxes with no data are left empty. In the IPCC method, many empty grid boxes are filled in by interpolation, with the net effect of increasing the warming trend. The following figure shows temperature trends for the Siberian area highlighted previously (lat. 65 to 80, long. 100 to 135). Of the eight main temperature dots on the IPCC map, three are interpolated (no data). For the five with data, the number of stations is indicated in the lower left corner of each grid-based temperature graph. The only grid with more than two stations shows no warming over the available data. The average for the entire 15 x 35 degree area is shown in the upper right of the figure. Of the eight individual stations, only two exhibit any warming since the 1930s (the one in long. 130-135 and only one of the two in long. 110-115). An important issue to keep in mind is that, in the calculation of global average temperatures, the interpolated grid boxes are averaged in with those that actually have data. This example shows how sparse and varying data can contribute to an average that is not necessarily representative.
                 Fig.24-Temperatures for 5x5 Grids in Lat. 65 to 80 – Long. 100 to 135
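The effect of averaging interpolated boxes together with data boxes, as described above, can be shown with invented numbers: once in-filled, a synthetic value carries the same weight as a measured one.

```python
# Sketch of the concern raised above: when empty boxes are filled by
# interpolation, the regional average mixes real and synthetic values
# with equal standing. All numbers here are invented.

real_boxes = [0.0, 0.1, -0.1, 0.0, 0.1]   # boxes with actual stations
interpolated = [0.4, 0.5, 0.45]           # in-filled from warmer neighbours

measured_only = sum(real_boxes) / len(real_boxes)
combined = sum(real_boxes + interpolated) / len(real_boxes + interpolated)
print(measured_only, combined)  # 0.02 vs 0.18125 -- in-fill lifts the average
```

Whether such a lift occurs in practice depends on where the neighbouring data come from, but the arithmetic shows why the paper treats interpolated boxes as a potential source of bias.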
6. Satellite Data - Satellites have more recently been used to remotely sense the temperature of the atmosphere, starting in 1979. The following figure shows the satellite data for Jan 1979 – Jan 2008 (left) and for Jan 2001 – Jan 2008 (right) [http://www.worldclimatereport.com/index.php/2008/02/07/more-satellite-musings/#more-306]. A consistent warming trend is not displayed, as it should be if CO2 were playing its hypothesized role. Satellite data are of somewhat limited use owing to the lack of long-term historical coverage: many locations show warm periods in the 1930s-40s, but satellite data start only in 1979.
                                                    fig. 25

The following figure shows the global satellite temperature anomalies since satellite data first became available in 1979. From 1979 to 1997 there was no warming. Following the major El Nino of 1997-98 there was a residual warming, and since then, no warming. All of the warming occurred in a single year.

   
Fig. 26-source: http://www.appinsys.com/GlobalWarming/SatelliteTemps.htm

 Global warming is not global. What is the meaning of a global average temperature? Warming is not uniform over the globe: there is a distinct north/south variance, with cooling in the Southern Hemisphere, and there are also major differences between regions within the hemispheres.
 A recent paper by Syun-Ichi Akasofu at the International Arctic Research Center (University of Alaska Fairbanks) provides an analysis of warming trends in the Arctic [http://www.iarc.uaf.edu/highlights/2007/akasofu_3_07/index.php]: "It is interesting to note from the original paper from Jones (1987, 1994) that the first temperature change from 1910 to 1975 occurred only in the Northern Hemisphere. Further, it occurred in high latitudes above 50° in latitude (Serreze and Francis, 2006). The present rise after 1975 is also confined to the Northern Hemisphere, and is not apparent in the Southern Hemisphere; … the Antarctic shows a cooling trend during 1986-2005 (Hansen, 2006). Thus, it is not accurate to claim that the two changes are a truly global phenomenon." The following figure shows satellite temperature anomaly data for three world regions (Northern Hemisphere, Tropics and Southern Hemisphere); warming has been occurring only in the Northern Hemisphere.
          fig. 27- Two hemispherical and tropical temperature trends. Source:  http://www.appinsys.com/GlobalWarming/GW_NotGlobal.htm for details on this.
Many such examples may be cited from scientific works and research available on the internet. Against this background, the IPCC's global warming cannot be valid. Proxies, manipulation and adjustment make the data unreal, and work based on unreal data will itself be unreal.
Manipulation is also made in historical temperature data cited from the Vostok ice sheet and tree-ring proxies. Again, there was a strong secret suppression of scientists' data, findings and opinions, and denial of peer review, by the IPCC, which came to light when the HadCRU server was hacked twice, in 2009 and 2011. In 2013 the draft of the IPCC Fifth Assessment Report was likewise leaked, revealing that the game-plan is to be revised, with more stress placed on solar forcing in temperature rise and less on climatic disturbances.
7. Justification  of the Proposed Effect of Global Warming of IPCC :
The IPCC attributes many current and long-standing effects of climate change to global warming, chiefly sea level change, melting of ice and retreat of glaciers, and climatic disturbances. Let us see whether these claims are correct.
Sea level change: The IPCC tries to alarm the public with threatening images of melting glaciers, huge chunks of ice breaking off the Antarctic and Greenland ice shelves, and rising ocean levels. But the predicted rise in ocean levels is trivial compared with the 400 feet they have risen in the last 18,000 years without any help from the burning of fossil fuels or any other human contribution to CO2.
Melting of ice and glaciers: Nautical records show that ice shelves have been breaking off for centuries, long before the rise in atmospheric CO2 or world industrialization. And polar ice is not disappearing. The West Antarctic Ice Sheet lost two-thirds of its ice mass since the last ice age but is now growing: side-looking interferometry shows it is growing at a rate of 26 billion tons a year. How can this be, when there are pictures of huge ice chunks breaking off and melting? While ice is disappearing at the perimeter, it is piling up inland. Most of the Antarctic ice lies above 4,000 feet. As the ice accumulates there, it pushes the glaciers toward lower elevations at the edge of the continent, where they break off. The only part of Antarctica that is warming is the peninsula, which is furthest from the South Pole and comprises only 2 percent of Antarctica, but it is the part the news media focus on when they talk about global warming in Antarctica. They never mention the other 98 percent, which is getting colder, as can be seen from the measurements of the British meteorological stations there, easily found on the internet.

At the top of the globe, the western Arctic is warming due to unrelated cyclical events in the Pacific Ocean while the eastern Arctic and Greenland are getting colder. According to a letter from Myron Ebell (quoted in TWTW of Feb. 3, 2007 at
http://sepp.org/), the chairman of the Arctic Climate Impact Assessment “redefined the Arctic in order to show a bigger warming trend and cut off the temperature record before 1950 so that they wouldn't have to explain why it was at least as warm in the 1930s as today in the Arctic (the reason claimed is a hoot: there weren't enough weather stations before 1950—even though there were more than in recent decades).” The recent proposal to list the polar bear as “threatened” mentions areas of open water in the Arctic that were frozen solid 30 years ago. But these same areas were reported as open water by explorers in the early 20th century. These areas subsequently froze during several decades and have now merely returned to their previous condition. The Greenland ice mass has thickened by seven feet since it was first measured by laser altimetry in 1980 and continues to grow.
What about the glaciers that are melting? For some glaciers around the world, historical records of their lengths exist over centuries. An intriguing study is "Extracting a Climate Signal from 169 Glacier Records" by J. Oerlemans (Science, April 29, 2005). As the chart from his work shows, glaciers have been receding since 1750, with the trend accelerating after about 1820. Henry Ford began assembly-line production in 1913, but by then half of the glacier loss from 1800 to 2000 had already occurred. And 70 percent of the glacier shortening occurred before 1940, that is, before worldwide industrialization and the increase in atmospheric CO2 that we are told is causing the glaciers to melt to today's level, as fig. 28 shows.
                               fig. 28- length of glaciers through time
Though glaciers have been receding worldwide, they have not retreated to their extents in the Medieval Warm Period. The Aletsch and Grindelwald glaciers (Switzerland) were much smaller between 800 and 1000 AD than today; the latter is still larger than it was in 1588 and earlier years. In Iceland today, the Drangajokull and Vatnajokull glaciers are far more extensive than in the Middle Ages, and farms remain buried beneath their ice. Yet this is the so-called "unprecedented" global warming, the "greatest threat to mankind," the disappearance of glaciers "the worst in thousands of years," all attributed to increases in carbon dioxide in recent decades.
Climatic disturbances: Violent weather is not getting worse as the IPCC suggests. Climate alarmists claim that global warming may increase severe weather events, but there is no evidence of increasing severe storm events in real-world data. The Accumulated Cyclone Energy (ACE) index combines a storm's intensity and longevity. Global hurricane activity has continued to sink to levels not seen since the 1970s. Over the past 60 years, Northern Hemisphere ACE has undergone significant inter-annual variability but exhibits no significant statistical trend. The Northern Hemisphere ACE for 2008 was 66% of the 2005 value, as shown in the stacked bar chart.
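The ACE index mentioned above is conventionally computed as 10^-4 times the sum of squared six-hourly maximum sustained winds (in knots) while a storm is at or above tropical-storm strength (35 kt). The storm track below is invented purely for illustration.

```python
# Sketch of the conventional ACE calculation for a single storm:
# 1e-4 * sum of squared 6-hourly max sustained winds (knots, >= 35 kt).

def ace(six_hourly_winds_kt):
    """ACE contribution of one storm from its 6-hourly wind maxima."""
    return 1e-4 * sum(v ** 2 for v in six_hourly_winds_kt if v >= 35)

storm_track = [30, 40, 55, 70, 65, 45, 30]  # knots, every six hours (invented)
print(ace(storm_track))  # 1.5775: only readings >= 35 kt contribute
```

A seasonal or hemispheric ACE value, like those compared in the text, is just the sum of such per-storm contributions.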
                            Northern Hemisphere Hurricane Activity (ACE)

                                                Fig.29

Causes of global warming: 1. CO2, especially anthropogenic CO2 among the greenhouse gases (GHG), is mainly held responsible for global warming. Yet CO2 comprises only 0.035 percent of our atmosphere and is a very weak greenhouse gas. Although it is widely blamed for greenhouse warming, it is not the only greenhouse gas, or even the most important one. Water vapor is by far the most important greenhouse gas, accounting for 97 or 98 percent of any greenhouse effect. The remainder is due to carbon dioxide, methane and several other gases. Furthermore, of the tiny percentage that CO2 contributes to the greenhouse effect, 97 percent of that is due to nature, not man. Termites, for example, produce CO2 emissions many times those of all the factories and automobiles in the world (Science, Nov. 5, 1982). Combining the factors of water vapor and nature's production of CO2, we see that 99.9 percent of any greenhouse effect has nothing to do with carbon dioxide emissions from human activity. So how much effect could regulating the tiny remainder have upon world climate? Keep in mind, too, that: (1) anthropogenic emissions of CO2 are only one percent of the atmospheric reservoir of CO2; (2) they are an even smaller percentage of the reservoir of 40,000 billion tons of carbon in the oceans, dissolved as CO2 and in other forms; (3) the oceans receive large quantities of CO2 from volcanic emissions bubbling up from the ocean floors, most significantly from the Mid-Atlantic Ridge; and (4) the oceans are by far the dominant source of atmospheric CO2, with the equatorial Pacific alone contributing 72 percent of atmospheric CO2. Is it credible that these vast global processes of nature can be altered by mankind's puny emissions of CO2? The global processes are so colossal as to overwhelm any human contribution. Furthermore, as Dr. Arthur B. Robinson has explained, "The turnover rate of carbon dioxide as measured by carbon 14 is too short to support a human cause [for a rise in CO2]." And Antarctic ice cores show that increases in carbon dioxide follow increases in temperature, not the other way around. One cannot have a cause-and-effect relationship in which the effect precedes the cause, but that is the real case in this relationship (fig.33).

The earth's temperature has risen about 1 degree F. in the past century. This is not “global warming” but normal fluctuation. The climate is always changing, and one would be hard pressed to find a century when the change did not amount to a degree or more in either direction. But temperature changes within the past century do not correlate with CO2 emissions. Most of the one degree temperature rise of the past century occurred before 1940, while 82 percent of the CO2 entered the atmosphere after 1940. From 1940 until 1975, carbon dioxide was strongly increasing but global temperatures cooled, leading to countless scare stories in the media about a new ice age commencing.
The following graph shows the temperature changes of the lower troposphere, from the surface up to about 8 km, as determined from the average of two analyses of satellite data. The UAH analysis is from the University of Alabama in Huntsville and the RSS analysis is from Remote Sensing Systems. The two analyses use different methods to adjust for factors such as orbital decay and inter-satellite differences. The best-fit line from January 2002 indicates a declining trend. Surface temperature data, by contrast, are contaminated by the effects of urban development. The Sun's activity, which was increasing through most of the 20th century, has recently become quiet, causing a change of trend. The rippled line shows the CO2 concentration in the atmosphere, as measured at Mauna Loa, Hawaii. The ripple in the CO2 curve is due to seasonal changes in biomass: a far greater land area in the northern hemisphere than in the south is affected by the seasons, and during the northern-hemisphere summer the large uptake of CO2 by growing plants causes a drop in the atmospheric CO2 concentration. Cool periods in 1984 and 1992 were caused by the El Chichón and Pinatubo volcanic eruptions, and the temperature spike in 1998 was caused by a strong El Niño. Natural climate change is much stronger than any effect from carbon dioxide.

                                                              Fig. 30
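The "best fit line" referred to above is an ordinary least-squares trend through the monthly anomalies. A minimal sketch follows; the anomaly values are made up for illustration, and the real UAH or RSS series would be loaded from the providers' data files instead.

```python
import numpy as np

# Hypothetical monthly temperature anomalies (deg C), standing in for
# a real satellite series such as UAH or RSS.
anomalies = np.array([0.20, 0.15, 0.22, 0.10, 0.05, 0.12,
                      0.08, 0.03, 0.09, -0.02, 0.04, -0.01])
months = np.arange(len(anomalies))

# Ordinary least-squares fit of degree 1: slope is in deg C per month.
slope, intercept = np.polyfit(months, anomalies, 1)
print(f"trend: {slope * 120:+.3f} deg C per decade")
```

A negative slope corresponds to the "declining trend" the text describes; with real data the result depends entirely on the period chosen for the fit.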

February 2010 satellite data: global warming is not global:
February 2010 was reported to have the warmest global average February anomaly since satellite data began in 1979, as shown in the following figure from http://www.drroyspencer.com/latest-global-temperatures/.

                                                              Fig.31
As shown in the following figure, most of the world had normal (white and light) or below-normal (light dark) temperatures. The global average is pulled up by one small area of the Arctic with much higher than normal temperatures.
                                              Fig. 32
Correlations with the broader historical record are even more out of whack. During the Late Ordovician Period of the Paleozoic Era, CO2 levels were 12 times higher than today. According to greenhouse theory, the earth should have been really hot, but instead it was in an Ice Age! And now we are told that the planet will overheat if CO2 doubles. Carbon dioxide is a weak greenhouse gas. The computer simulations predicting environmental catastrophe depend on the small amount of warming from CO2 being amplified by increased evaporation of water, since water vapor is a strong greenhouse gas. But in many documented periods of higher CO2, even during much warmer temperatures, no such catastrophic amplification occurred. So there is no reason to fear the computer predictions, or to base public policies on them. They are clearly wrong.
The Antarctic Vostok ice core records, often cited as evidence that CO2 causes climate change, actually show cause and effect reversed: the CO2 increase lagged the warming by about 800 years. Temperature increases cause the oceans to expel CO2, raising the CO2 content of the atmosphere.
        
     Fig.33- The ice core data proves that CO2 is not a primary climate driver.
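A lag of the kind described above is the sort of offset one can estimate with a lagged cross-correlation: shift one series against the other and find the shift that maximises their correlation. The sketch below uses synthetic data with a built-in lag of 8 steps; the real Vostok records are not reproduced here, and one step is only notionally "about 100 years".

```python
import numpy as np

rng = np.random.default_rng(0)
n, true_lag = 500, 8   # toy series; one step might stand for ~100 years

temperature = rng.normal(size=n)            # synthetic "temperature" record
co2 = np.empty(n)
co2[true_lag:] = temperature[:-true_lag]    # CO2 follows temperature 8 steps later
co2[:true_lag] = temperature[0]
co2 += rng.normal(scale=0.3, size=n)        # measurement noise

# Correlate CO2 against temperature shifted by each candidate lag
# and pick the lag with the highest correlation.
lags = range(20)
corrs = [np.corrcoef(temperature[:n - k], co2[k:])[0, 1] for k in lags]
best = max(lags, key=lambda k: corrs[k])
print(f"estimated lag: {best} steps")
```

The estimated lag recovers the built-in offset; the sign of the offset is what tells which series leads and which follows.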
Since the greenhouse gas theory cannot explain global temperature changes, what does? The sun, aided by cosmic rays from beyond our solar system. Everyone knows the sun heats the earth, but that heat is not uniform. "Sunspot" cycles vary solar intensity, and these correlate extremely well with shorter-term cycles in global temperatures. Dr. Baliunas of the Harvard-Smithsonian Center for Astrophysics has extended the correlation back another hundred years by using the sun's magnetic cycle as a proxy for its brightness (irradiance). Longer and more severe swings in global climate, such as the ice ages, correlate with changes in the earth's orbit around the sun. Clouds have a hundred times stronger effect on climate than does carbon dioxide: a one percent increase in cloud cover would offset a doubling of atmospheric CO2, yet from 1988 to 1990 cloud cover increased by 3 percent. And what determines cloud cover? The sun, through variations in cosmic rays and the solar wind. In the words of Dr. Theodor Landscheidt of Canada's Schroeder Institute, "When the solar wind is strong and cosmic rays are weak, the global cloud cover shrinks. It extends when cosmic rays are strong because the solar wind is weak. This effect [is] attributed to cloud seeding by ionized secondary particles."


                                                                                                                 
                                                                fig.34
Mars is undergoing global warming. Clearly this cannot be explained by the popular CO2 answer. What could possibly be causing Martian warming if not the sun? And if that's what is happening on Mars, why not on earth? After all, we share the same sun. Why is there such stubborn adherence to the CO2 hypothesis despite its failures? In June 2006 the journal Nature published three separate papers on an expedition that extracted sediment samples just 50 miles from the North Pole. Based in part on specimens of algae that indicate subtropical or tropical conditions, the scientists determined that 55 million years ago the Arctic Ocean had a balmy Florida-like year-round average temperature of 74 degrees F. Warming of this magnitude could not have been produced by carbon dioxide, but scientists cling tenaciously to the popular but failed explanation. An article in the New York Times describes this Arctic discovery as offering insight into “the power of greenhouse gases to warm the earth.” It quotes several scientists as saying they still support the idea of greenhouse gases determining the planet's warming or cooling, even though they admit they don't understand what happened here.

8. Politics behind global warming: Why there is such reluctance to admit that the sun is responsible for changes in global climate, when the evidence is so strong, is not known. Why is there such emphasis on CO2 when the human contribution of it is trivial and water vapor is so much more important in the greenhouse effect? The same answer serves both questions: governments can only control people, not nature. If the sun is responsible for climate change, then there is nothing governments can do about it. If water vapor is the key to greenhouse warming, then there is nothing governments can do about it. For government to be relevant on this issue, it must have a cause that can be blamed on people, because people are the only thing government can control. And if government is not relevant on this issue, then there is no need for those political appointees from 150 nations to the IPCC. Nor is there a need for all the government grants to all the scientists and institutions for studies that keep trying to prove that increases in CO2 are causing global warming, in order to validate government intervention. Nor is there a justification for spending other people's money (taxpayer funds) for such purposes. Nor is there a need for the bureaucrats and governmental framework to study, formulate and implement regulations for controlling CO2 emissions, extending the role of government over every aspect of people's lives. H.L. Mencken once said, "The urge to save humanity is almost always a false front for the urge to rule it."
There is world intergovernmental politics behind global warming and climate change. It is an effect of the cold war waged against the socialist world by the capitalist and imperialist world. It started as an action against the working class of England and was then extended into a general attack on the working people and toiling masses of the world. The inception of the idea is very interesting. In the United States, the mass media devoted little coverage to global warming until the drought of 1988 and James E. Hansen's testimony to the Senate, which explicitly attributed "the abnormally hot weather plaguing our nation" to global warming. The British press also changed its coverage at the end of 1988, following a speech by Margaret Thatcher to the Royal Society advocating action against human-induced climate change. According to Anabela Carvalho, an academic analyst, Thatcher's "appropriation" of the risks of climate change to promote nuclear power, in the context of the dismantling of the coal industry following the 1984-1985 miners' strike, was one reason for the change in public discourse. At the same time environmental organizations and the political opposition were demanding "solutions that contrasted with the government's". Many European countries took action to reduce greenhouse gas emissions before 1990; West Germany started to act after the Green Party took seats in Parliament in the 1980s. All countries of the European Union ratified the 1997 Kyoto Protocol, and substantial activity by NGOs took place as well. Both "global warming" and the more politically neutral "climate change" were listed by the Global Language Monitor as political buzzwords or catchphrases in 2005. In Europe, the notion of human influence on climate gained wide acceptance more rapidly than in the United States and other countries.
A 2009 survey found that Europeans rated climate change as the second most serious problem facing the world, between "poverty, the lack of food and drinking water" and "a major global economic downturn". Eighty-seven per cent of Europeans considered climate change to be a very serious or serious problem, while ten per cent did not consider it a serious problem.
The United States accepts that global warming is occurring but refuses to accept the Kyoto Protocol and carbon budgeting, because it apprehends a great loss to US industrial development were it to accept the protocol. US governmental and non-governmental agencies such as GISS work with the IPCC; they also help and back NGOs with funding, promote the executive and judicial environmental-protection policies of governments in different countries, and support the work of the UNO in such activities. The EEU, England and the UNO all take the side of global warming, pressing to turn world politics toward environmental protection and make it the main political agenda amidst a longstanding and ever-deepening world economic recession and depression under which the world's toiling masses suffer greatly. The countries of the world also take part because of GATT and the emergence of MNCs and TNCs, attempting to get over their own deepening economic crises not by solving them but by diverting the issues of development to the environment in the name of the safety of mankind. Citizen bodies, scientists and environment-related NGOs are on the same line. They adopt policies based on environmental determinism and environmentalism. But it is to be remembered that environmentalism is "... an anti-scientific reactionary trend ... The reason for its tenacity of life is the tendency of bourgeois authors to appeal to geographical environment or biological laws for proof of the suitability or unsuitability (in accordance with the interests of the bourgeoisie at a given moment) of historically complex social relations" (Bryeterman).
Author's note and acknowledgements: Most of the information, write-ups, graphs, citations, etc. have been taken directly from various websites of the IPCC, GISS, GHCN, etc. and from scientists' writings, and incorporated directly or indirectly. The author apologises for this use and acknowledges the many websites, authors and scientists who could not be properly credited here for want of space.
                                                ------------------------------------------