*******
DISCRETE STUDY OF LUSTER OF METALS: SPECIAL REFERENCE TO THE COMPARISON OF SOME PARTICULAR METALS
ANOOP CHAUDHARY
RESEARCH SCHOLAR PHYSICS: VINAYAKA MISSION UNIVERSITY T.N.
ABSTRACT
As per the scientific definition, a material has luster if it shines and reflects light brilliantly. Gold, silver and brass are familiar examples of lustrous metals (brass being an alloy), and even the non-metal diamond shares this quality of shining. This research studies the luster of metals through a comparison of their properties: several lustrous metals are compared, and the results are presented as a statistical analysis.
INTRODUCTION
For this research we consider three lustrous metals: gold, silver and brass. Luster varies along a wide continuum, and there are therefore no rigid boundaries between the different descriptive terms.
This work is undertaken to establish the specific facts about the luster properties of metals. No consolidated reference material exists so far for a comparative study of the luster of metals, so information must currently be gathered from many scattered sources. The present work is intended to support future study of the luster properties of metals.
REVIEW
The luster of metals is familiar to almost everyone who works with them, and related material has been produced through different sources and for different uses. Reference material on the theory of luster is therefore already available in many forms; industries and academic institutions hold such information in books and other official records, though in scattered form.

Among the prior works consulted are Toxicology of Metals by Gary L. Hook and Peter T. LaPuma, which notes that metals are lustrous and, among the transition metals, typically ductile and malleable; work on metal solubility and speciation by Nowack, Schulin, Tercier-Waeber and Luster, arising from the PhD thesis of J. Dessureault-Rompré; and work in mineralogy and economic geology by Mr. Coltman. The current research draws inspiration from these previous works and aims to be a step in the advancement of the field.
OBJECTIVES
There are several objectives of the research on the above topic:
- To spread knowledge of the luster property of metals
- To motivate researchers toward the advancement of current technology
- To produce an advanced work building on previous work in the same field
- To spread awareness about the use and adoption of metals having luster properties
- To structure the thesis so that further research can be done as an advanced version of this work
- To identify comparisons among metals in their property of luster
- To motivate researchers and inventors toward advanced technology
- To produce consolidated material on the properties and comparison of metals having luster
NEED AND SCOPE OF STUDY
To obtain combined and complete knowledge of the luster property of metals, one currently has to consult many sources, because no data is available in consolidated form. The contents of this research work will be a solution to that problem.
Anyone doing advanced research on a related topic in future must have a full database, so the current work will serve as a unique consolidated collection of references.

Newly established industries and new workers who intend to work in the metal field will always need specific reference material to consult on many points; the work done on this topic will be helpful to them.

The use, maintenance, finishing and storage of metals for future use will also be specifically explained. The research work therefore has wide scope for future related work.
METHODOLOGY
In the first phase of the research, a brief introduction to the metals having luster will be given, and the definition of luster and other related matters will be explained. The ratios, uses and properties of the different metals will then be described in a comparative study, presented with the help of mathematical analysis, images and histograms.

Images and numerical data help convey the concepts and the comparison, so the thesis will include pictorial formats with graphs and histograms throughout. The final part of the thesis will comprise the conclusions and the references.
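As a sketch of the kind of statistical comparison the methodology describes, the snippet below ranks three lustrous metals by mean reflectance. The reflectance values are illustrative assumptions for this sketch, not measured data.

```python
import statistics

# Approximate visible-light reflectance (%) of polished samples.
# These numbers are assumed example values, not measurements.
reflectance = {
    "Silver": [95, 96, 97, 95, 96],
    "Gold":   [85, 84, 86, 85, 85],
    "Brass":  [60, 62, 61, 63, 60],
}

# Summary statistics per metal: mean and sample standard deviation
summary = {
    metal: {"mean": statistics.mean(vals), "stdev": statistics.stdev(vals)}
    for metal, vals in reflectance.items()
}

# Rank metals by mean reflectance, a simple numerical proxy for luster
ranking = sorted(summary, key=lambda m: summary[m]["mean"], reverse=True)
for metal in ranking:
    print(metal, round(summary[metal]["mean"], 1))
```

With these assumed values the ranking comes out silver, gold, brass, which is the kind of ordered comparison the histograms in the thesis would display.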
CONCLUSION
This work will be very helpful for gaining knowledge of luster properties and for the comparison among metals. There is a possibility of publishing the current work as a book, so that it could become a standard reference for students, researchers and other interested readers, as the thesis will comprise a consolidated database for its subject. Completion of this thesis should also increase awareness of current technology.

Researchers continuing work on related topics may draw on the contents of this thesis. The work will be most useful for people finding and inventing new technologies for the use, maintenance and restoration of metals. Related research going on elsewhere in the world can also use the current work for its advancement and as a source database.
*****
General Introduction to Migration of Charge in DNA
Jasmin Jadon: Research Scholar Biotechnology:
Introduction: It is well known in biotechnology that charge migration is a very important process in biomolecules and other large polymeric systems. Charge migration in proteins and similar systems has been found to be highly efficient, but its mechanistic origin is still not well understood, even though various models have been advanced (1, 2). Recently we have demonstrated that photoionization at a specific chromophore in a series of polypeptides can lead to facile migration of the positive charge over long peptide chains (3–5); we have studied neutral peptides of natural amino acids of the type (X)n–Y (n = 1, 2, 3), where Y denotes an aromatic amino acid, in the gas phase. They were prepared by laser desorption and supersonic cooling. Local ionization is performed by resonant laser excitation of the aromatic amino acid Y located at the C terminus. Subsequent UV photofragmentation of the cation is shown to directly reflect the prior charge migration in these large molecules. The charge is initially localized at the chromophore, in the form of an electronic hole in the ground electronic state of the cation. After photoexcitation of the cation, the electron is promoted into a charge-transfer (CT) state (or photoexcited to a localized state followed by internal conversion to a CT state), and the new hole thus created can hop between local sites in the chain.
Estimation: To a first order, the distance and direction of the charge migration are determined by the ionization energies of the individual amino acids, rather than by the ionization potential of the entire supramolecule (6). The charge migration can be blocked by a local barrier as small as 0.2–0.3 eV (the difference in ionization potential between neighboring amino acids). In this work we propose a model that is compatible with experiment and supported by ab initio (6) and molecular dynamics (7) calculations. Because of the importance of DNA damage and its repair, diverse biophysical and biochemical studies have sought to understand electron transfer (ET) in DNA (8–11). The strong resemblance of the base-pair stack of DNA to conductive one-dimensional aromatic crystals has prompted the proposal that long-range charge transport might proceed through DNA. Recently Zewail and co-workers (8) reported, with femtosecond resolution, the direct observation of ultrafast ET in DNA, initiated by excitation of tethered ethidium, the intercalated electron acceptor.
Conclusion: These observations connect to long-range charge hopping in DNA (ref. 19 and the references therein). In our theoretical treatment of long-range charge migration in proteins and DNA, the torsional motion of the floppy backbone is emphasized as playing a very important role in hole hopping between local amino acid sites. We have derived generalized master equations that can describe the time evolution of charge migration (and other dynamical processes) in complex systems.
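The hopping picture behind such master equations can be illustrated with a minimal sketch: a hole hopping along a three-site chain with a single assumed rate constant. The rate, chain length, and time step below are assumptions for illustration only, not values taken from the experiments or derivations cited above.

```python
# Minimal master-equation sketch: dP_i/dt = sum_j (k_ji * P_j - k_ij * P_i)
# for a hole hopping along a three-site chain with symmetric rate k.
k = 1.0              # nearest-neighbour hopping rate (arbitrary units, assumed)
dt, steps = 0.001, 10_000  # simple forward-Euler integration to t = 10

P = [1.0, 0.0, 0.0]  # hole initially localized on site 0 (the chromophore)
for _ in range(steps):
    dP = [
        -k * P[0] + k * P[1],                  # site 0: exchanges with site 1
         k * P[0] - 2 * k * P[1] + k * P[2],   # site 1: coupled to both ends
         k * P[1] - k * P[2],                  # site 2: exchanges with site 1
    ]
    P = [p + dt * d for p, d in zip(P, dP)]

print([round(p, 3) for p in P])  # populations relax toward 1/3 each
```

Total population is conserved by construction (each column of the rate matrix sums to zero), and the initially localized hole spreads toward the uniform distribution, which is the qualitative behaviour a generalized master equation describes.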
CONTRIBUTION OF PROPERTIES AND EFFECTS OF GLUCOSE IN OUR LIFE
By : Santosh Kumar Dakhle
Research Scholar: CMJ University, Shillong, Meghalaya
A question that often arises in chemistry is why glucose, and not another monosaccharide such as fructose, is so widely used in organisms; the answer is not clearly understood. One reason might be that glucose has a lower tendency than other hexose sugars to react non-specifically with the amino groups of proteins. This reaction (glycation) reduces or destroys the function of many enzymes, and the low rate of glycation is due to glucose's preference for the less reactive cyclic isomer. Nevertheless, many of the long-term complications of diabetes (e.g., blindness, renal failure, and peripheral neuropathy) are probably due to the glycation of proteins or lipids.[5] In contrast, the enzyme-regulated addition of glucose to proteins by glycosylation is often essential to their function. Another reason glucose is the most common sugar is that it is the most conformationally stable of the possibilities.
Non-invasive blood glucose determination has been investigated by more than 100 research groups worldwide during the past fifteen years. Many measurement methods are based on the capacity of near-infrared light to penetrate a few millimetres into human tissue, where it interacts with glucose.

A change of glucose concentration may change the optical parameters of tissue, so that the glucose concentration can be extracted by analysing the received optical (or other) signals. This paper applies a streak camera in conjunction with the photoacoustic (PA) technique to study glucose determination based on the optical scattering effect and on skin properties.

In the near-infrared region, the change in scattering, rather than the change in absorption or in the thermal-physical parameters of glucose, dominates the PA response of glucose in whole blood. However, in vivo skin tests show that other factors, such as physiological changes, blood circulation, water content, and body-temperature drift, may interfere with the PA response of glucose in non-invasive measurement. This is because skin is highly scattering and neither homogeneous nor simply multilayered, so any change in its morphology will change the scattering within it.

In glucose measurements, the PA technique has higher detection sensitivity than the near-infrared absorption method, and the scattering effect of glucose can increase the PA response. From the viewpoint of detection sensitivity, however, the PA method is not the best choice for optical scattering measurement or for non-invasive blood glucose determination based on optical scattering, because the PA mechanism responds to the total effect of optical absorption and thermal expansion rather than to optical scattering directly.
Future research should focus on finding a better scattering-related method or larger glucose-induced optical effects in tissue. Glucose is a ubiquitous fuel in biology, used as an energy source in most organisms, from bacteria to humans, whether by aerobic respiration, anaerobic respiration, or fermentation. Glucose is the human body's key source of energy through aerobic respiration, providing approximately 3.75 kilocalories (16 kilojoules) of food energy per gram.[6] Breakdown of carbohydrates (e.g. starch) yields mono- and disaccharides, most of which is glucose. Through glycolysis and later in the reactions of the citric acid cycle, glucose is oxidized to eventually form CO2 and water, yielding energy mostly in the form of ATP. The insulin reaction, among other mechanisms, regulates the concentration of glucose in the blood.
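The figure quoted above, roughly 3.75 kcal (about 16 kJ) per gram, is a simple conversion worth making explicit; the snippet below does the arithmetic (the per-gram value is taken from the text, the conversion factor 4.184 kJ/kcal is standard):

```python
# Food energy of glucose: ~3.75 kcal per gram, converted to kilojoules.
KCAL_PER_GRAM = 3.75   # value quoted in the text above
KJ_PER_KCAL = 4.184    # standard thermochemical conversion factor

def food_energy(grams):
    """Return (kcal, kJ) supplied by `grams` of glucose."""
    kcal = grams * KCAL_PER_GRAM
    return kcal, kcal * KJ_PER_KCAL

kcal, kj = food_energy(1.0)
print(round(kcal, 2), round(kj, 1))  # 3.75 kcal, 15.7 kJ per gram
```

The exact conversion gives about 15.7 kJ per gram, which the text rounds to 16 kJ.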
References
1. V.; Migarskaya, L. B. (1960), "Heats of combustion of some amino-acids", Russ. J. Phys. Chem. (Engl. Transl.)
2. Boerio-Goates, Juliana (1991), "Heat-capacity measurements and thermodynamic functions of crystalline α-D-glucose at temperatures from 10 K to 340 K", J. Chem. Thermodynam.
3. Clark, D.; Sokoloff, L. (1999), Basic Neurochemistry: Molecular, Cellular and Medical Aspects.
4. "Dextrose", Merriam-Webster Online Dictionary.
5. High Blood Glucose and Diabetes Complications: The buildup of molecules known as AGEs may be the key link, American Diabetes Association.
6. S. Nissilä and O. Ahola, et al., Medical Sensors IV and Fibre Optics Sensor III, Proceedings.
7. F. A. Duck, Physical Properties of Tissue: A Comprehensive Reference Book, Academic Press Inc., San Diego, 1990.
*******
Volume – III
GENERAL THEORY OF SPECTROSCOPY AND CHROMATOGRAPHY
FOR CHARACTERIZATION IN VARIOUS SOLUTIONS
By: Subhash Ram Dabar
Research Scholar : CMJ University, Shillong, Meghalaya
The two major areas of the analytical sciences are Spectroscopy and Chromatography. Spectroscopy covers a wide range of techniques for chemical analysis at the atomic and molecular level. Chromatography is a general term for techniques that separate, identify and quantify compounds.
Spectroscopy concerns the interactions of matter with radiation, particularly in electromagnetic regions such as ultraviolet-visible and infrared light. The range of light-analysis techniques includes absorbance, reflection, emission, scattering and refractive index. The physical quantity measured is either the energy absorbed or the energy produced.
Absorption spectroscopy includes atomic absorption and molecular techniques such as infrared and ultraviolet-visible. Emission spectroscopy includes fluorescence, and one of the main examples of scattering spectroscopy is Raman spectroscopy. Other commonly used spectroscopic techniques are circular dichroism, dual polarisation interferometry and laser spectroscopy.
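The absorbance techniques above rest on the Beer-Lambert law, A = εlc, which relates measured absorbance to concentration. The sketch below evaluates it for one illustrative case; the path length and concentration are assumed example values (6220 L mol⁻¹ cm⁻¹ is the commonly quoted molar absorptivity of NADH at 340 nm).

```python
def absorbance(epsilon, path_cm, conc_molar):
    """Beer-Lambert law: A = epsilon * l * c."""
    return epsilon * path_cm * conc_molar

def transmittance(a):
    """Fraction of incident light transmitted: T = 10**(-A)."""
    return 10 ** (-a)

# Example: epsilon = 6220 L/(mol*cm), 1 cm cuvette, 50 micromolar solution
a = absorbance(6220, 1.0, 50e-6)
t = transmittance(a)
print(round(a, 3), round(t, 3))  # A = 0.311, about 49% of light transmitted
```

This is why absorbance detectors report concentration so directly: for dilute solutions A is linear in c, so a calibration line through a few standards suffices.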
Reaction Kinetics are usually detected and recorded by spectroscopic techniques such as UV absorbance, fluorescence or circular dichroism. One of the main rapid kinetics techniques is stopped-flow, commonly used with fluorescence detection due to its high sensitivity.
In high-pressure liquid chromatography (HPLC), the sample to be analysed is dissolved in a liquid which is pumped through a column packed with particles. As the analyte traverses the length of the column it is slowed by specific chemical or physical interactions with the particles. When the analyte emerges from the column it passes through a detector, which records a peak with a retention time characteristic of that analyte. Traditional HPLC is performed with columns packed with particles of around 5 μm diameter; smaller particles (<2 μm) provide more surface area, with significant increases in resolution, speed, and sensitivity.
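The retention times the detector records can be turned into a quantitative measure of separation. A common figure of merit is the resolution Rs = 2(t2 − t1)/(w1 + w2); the peak times and widths below are assumed example values.

```python
def resolution(t1, w1, t2, w2):
    """Chromatographic resolution Rs = 2*(t2 - t1) / (w1 + w2),
    with retention times t and baseline peak widths w in the same units."""
    return 2 * (t2 - t1) / (w1 + w2)

# Two analytes eluting at 4.0 and 4.6 min, each with 0.25 min baseline width
rs = resolution(4.0, 0.25, 4.6, 0.25)
print(round(rs, 2))  # Rs = 2.4; Rs >= 1.5 is usually taken as baseline separation
```

Smaller packing particles sharpen the peaks (smaller w), which is why sub-2 μm columns improve resolution without moving the retention times.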
The development of chemistry is based on finding out the composition of various compounds, not just their chemical aspects but their physical parameters and properties as well. The branch of chemistry concerned with characterizing an element, compound or mixture is known as analytical chemistry. The central task of analytical chemistry is the determination of the chemical composition of matter: the identification of a substance and the elucidation of its structure and composition. These aspects are covered by analytical chemistry, which involves a wide variety of equipment and techniques.
Analytical chemistry is an interdisciplinary branch of science to which a large number of inputs from different branches of science have contributed. For instance, most chromatographic methods were invented by biochemists or biological scientists, while physicists' contributions account for nuclear magnetic resonance, mass spectrometry, and so on.
*******
IMPROVISATION OF OLD PATTERNED MATERIALS IN MODERN HOUSES OF INDIA
Madan Lal Gupta : Research Scholar Civil Engineering
ABSTRACT
These days, low-cost, high-safety materials are in demand in almost all countries of the world. The heavy loss of life and property in the Asian region due to the frequent occurrence of natural disasters dictates the need to evolve safer habitats that can resist the loads, forces and effects of natural disasters. This is imperative in the context of the huge socio-economic losses to nations. All efforts should therefore be promoted and nurtured toward safer building construction that withstands both normal loads and the effects of natural disasters. This can only happen when an enabling environment is created: an effective techno-legal and techno-financial regime and an effective building regulatory mechanism for creating safe habitats. With a large part of the community belonging to low-income strata, and with people-driven construction processes, appropriate grassroots-level technology-transfer initiatives should be put in place to create awareness, appreciation and application models for disaster-resistant and cost-effective building technologies.
Introduction
In developing countries such as India, technology in civil construction is growing, with new inventions appearing day by day. Among the recent additions in the field are chowkhat (door-frame) blocks, fly ash brick roof tiles and interlocking wall blocks. This research is based on promoting and expanding the use of prefabricated building components made with fly ash.

Prefabrication is the manufacture of a whole building, or of components cast in a factory or on site, before placement in position, assembling the structural units so that they can be erected easily and rapidly. Prefabricated buildings are pre-cut, pre-drilled and pre-engineered before the actual building is constructed. Prefabricated structures (PFS) are useful for sites unsuited to normal construction methods, such as hilly regions, and where normal construction materials are not easily available.
Reviews
The information supplied in this report is obtained from reliable sources.

Fly ash is a fine, glass-like powder recovered from the gases created by coal-fired electric power generation. The material solidifies while suspended in the exhaust gases and is collected by electrostatic precipitators or filter bags. Because the particles solidify in suspension, fly ash particles are generally spherical, ranging in size from 0.5 µm to 100 µm, and consist mostly of silicon dioxide (SiO2), aluminium oxide (Al2O3) and iron oxide (Fe2O3).

Selection of a fly ash brick manufacturing technology should be based on the availability of raw materials, the financial strength of the entrepreneur, and the market characteristics (size and nature of applications, i.e. the required strength and quality of the bricks or blocks).

Clay-fly ash brick technologies should be encouraged for converting the large number of clay-fired brick units, and urgent steps should be taken, since only low investment and minimal change from the status quo are required.
The advantages of prefabrication are:
1. In prefabricated construction the components are ready-made and self-supporting, so shuttering and scaffolding are eliminated, with a saving in shuttering cost.
2. In conventional methods the shuttering gets damaged by repetitive use, frequent cutting, nailing and so on. The mould for a precast component, on the other hand, can be reused a large number of times, reducing the cost of the mould per unit.
3. In a prefabricated housing system, time is saved because the precast elements are cast off-site while the foundations are being laid, and finishes and services can be applied below the slab immediately. In conventional in-situ RCC slabs, no such work can be done until the props and shuttering are removed. Saving time thus saves money.
Recommendations
1. Risk and security have to be defined in the context of location-specific vulnerabilities.
2. The safety and security of people can be ensured only when the affected communities determine their priorities and control the use of resources.
3. Building innovations evolved by the poor to cope with crises can be sustained by mainstream institutional structures if they are supported with resources and capacities.
4. Standard safety norms and building codes should be developed in relation to local skills, materials and resources.
5. Alternative institutional arrangements should ensure that resources are allocated for widespread awareness of safe building techniques.
*******
Modern Techniques and Implementation of Latest Updates on Nanostructures & Nanomaterials
Bharat Singh Jayant : Research Scholar CMJ University
Nanomaterials is a field that takes a materials-science-based approach to nanotechnology. It studies materials with morphological features on the nanoscale, especially those whose special properties stem from their nanoscale dimensions. The nanoscale is usually defined as smaller than one tenth of a micrometre in at least one dimension,[1] though the term is sometimes extended to materials up to a micrometre.
On 18 October 2011, the European Commission adopted the following definition of a nanomaterial:
A natural, incidental or manufactured material containing particles, in an unbound state or as an aggregate or as an agglomerate and where, for 50% or more of the particles in the number size distribution, one or more external dimensions is in the size range 1 nm – 100 nm. In specific cases and where warranted by concerns for the environment, health, safety or competitiveness the number size distribution threshold of 50% may be replaced by a threshold between 1 and 50%.
An important aspect of nanotechnology is the vastly increased ratio of surface area to volume in many nanoscale materials, which makes new quantum mechanical effects possible. One example is the "quantum size effect", in which the electronic properties of solids are altered by great reductions in particle size. Nanoparticles likewise take advantage of their dramatically increased surface-area-to-volume ratio; their optical properties, e.g. fluorescence, become a function of particle diameter. This effect does not come into play in going from macro to micro dimensions, but becomes pronounced when the nanometre size range is reached.
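The surface-area-to-volume argument above is easy to quantify for spherical particles, where the ratio reduces to 3/r. The radii below are chosen purely for illustration:

```python
import math

def sa_to_vol(radius_nm):
    """Surface-area-to-volume ratio of a sphere of the given radius (nm)."""
    area = 4 * math.pi * radius_nm ** 2
    volume = (4 / 3) * math.pi * radius_nm ** 3
    return area / volume  # algebraically equal to 3 / radius_nm

# Shrinking the radius from 1 um (1000 nm) to 1 nm raises SA/V a thousandfold
for r in (1000.0, 100.0, 10.0, 1.0):  # radii in nm
    print(f"r = {r:6.1f} nm  SA/V = {sa_to_vol(r):.3f} per nm")
```

The thousandfold jump in SA/V between micro and nano dimensions is exactly why surface-dominated effects, absent at the micro scale, become pronounced at the nanoscale.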
Nanomaterials and nanostructures are key components in the development of nanomanufacturing technologies leading toward macroscopic system innovations that directly benefit society. They find extensive applications in the areas of electronics, optoelectronics, photonics, photovoltaics, catalysis, energy storage, sensors, environmental science, and biomedical systems.
Georgia Tech faculty are actively involved in related research, utilizing the state-of-the-art cleanroom and characterization facilities at the IEN to accomplish their research goals. Ongoing research involves the synthesis of nanomaterials, emphasizing the relationships between atomic structure and microstructure; the study of material properties and characterization; the processing and performance of nanomaterials; the development of nanogenerators and self-powered nanosystems; and graphene nanoelectronics. Active research also includes potential applications of nanomaterials in nanomedicine, plasmonics, and nanocatalysis. In the area of biomedical engineering, examples of faculty research include the use of diamond coatings as a bio-barrier to prevent the body from attacking implants, and nanomaterials in tissue engineering.
***
Current Techniques and Improvisation of Metaphysics
Anand Swaroop Laharia : Research Scholar CMJ University
Metaphysics is a traditional branch of philosophy concerned with explaining the fundamental nature of being and the world that encompasses it, although the term is not easily defined. Traditionally, metaphysics attempts to answer two basic questions in the broadest possible terms:
-
What is ultimately there?
-
What is it like?
A person who studies metaphysics is called a metaphysicist or a metaphysician. The metaphysician attempts to clarify the fundamental notions by which people understand the world, e.g., existence, objects and their properties, space and time, cause and effect, and possibility. A central branch of metaphysics is ontology, the investigation into the basic categories of being and how they relate to each other. Another central branch of metaphysics is cosmology, the study of the origin (if it had one), fundamental structure, nature, and dynamics of the universe.
Prior to the modern history of science, scientific questions were addressed as a part of metaphysics known as natural philosophy. Originally, the term "science" (Latin scientia) simply meant "knowledge". The scientific method, however, transformed natural philosophy into an empirical activity deriving from experiment, unlike the rest of philosophy. By the end of the 18th century it had begun to be called "science" to distinguish it from philosophy. Thereafter, metaphysics denoted philosophical enquiry of a non-empirical character into the nature of existence.[6] Some philosophers of science, such as the neo-positivists, say that natural science rejects the study of metaphysics, while other philosophers of science strongly disagree.
The world seems to contain many individual things, both physical, like apples, and abstract, such as love and the number 3; the former objects are called particulars. Particulars are said to have attributes, e.g. size, shape, color, location, and two particulars may have some such attributes in common. Such attributes are also termed Universals or Properties; the nature of these, and whether they have any real existence and if so of what kind, is a long-standing issue, realism and nominalism representing opposing views.
Metaphysicians concerned with questions about universals or particulars are interested in the nature of objects and their properties, and the relationship between the two. Some, e.g. Plato, argue that properties are abstract objects, existing outside of space and time, to which particular objects bear special relations. David Armstrong holds that universals exist in time and space but only at their instantiation and their discovery is a function of science. Others maintain that particulars are a bundle or collection of properties (specifically, a bundle of properties they have).
***
Prediction of Useful Data by Applying Data Warehousing Techniques
Manish Kumar Shukla:
Research Scholar Computer Science - Shri Venkateshwara University Gajraula, Amroha, (U.P.),
Data mining techniques give people new power to research and manipulate existing large volumes of data. The data mining process discovers interesting information hidden in the data, which can be used for future prediction and/or for intelligently summarizing the data. Data mining techniques have been applied successfully in areas such as marketing, medicine, finance, and car manufacturing. In this paper, a proposed data mining application in the car manufacturing domain is explained and tested. The results demonstrate the capability of data mining techniques to provide important analyses such as launch analysis and slow-turn analysis. Such analyses give the car market a basis for more accurate prediction of future demand.
The main steps for providing the Slow Turn/Launch analysis are the following:
•Receiving and validating the data sources.
•ETL the data sources to the data warehouse.
•Processing the data warehouse to generate the required data marts.
•Building the required data marts and OLAP cubes.
•Generating the reports.
•Delivering the analysis in the form of Excel work books, and PowerPoint slides.
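The steps above can be sketched as a small pipeline. The following is an illustrative sketch only, using an in-memory SQLite database as a stand-in for the warehouse and data mart; the table and column names (stock, mart_slow_turn, days_in_stock) and the 60-day threshold are hypothetical, not taken from the paper.

```python
# Illustrative Slow Turn pipeline: validate -> load -> build data mart -> report.
# All names and thresholds are hypothetical stand-ins for the paper's system.
import sqlite3

def run_pipeline(stock_rows):
    con = sqlite3.connect(":memory:")
    # Steps 1-2: receive/validate the source and load it into the warehouse.
    con.execute("CREATE TABLE stock (model TEXT, dealer TEXT, days_in_stock INT)")
    valid = [r for r in stock_rows if r[2] >= 0]  # reject impossible stock ages
    con.executemany("INSERT INTO stock VALUES (?, ?, ?)", valid)
    # Steps 3-4: process the warehouse into a data mart (avg stock age per model).
    con.execute("""CREATE TABLE mart_slow_turn AS
                   SELECT model, AVG(days_in_stock) AS avg_days
                   FROM stock GROUP BY model""")
    # Step 5: generate the report - models whose stock turns slowly (> 60 days).
    report = con.execute(
        "SELECT model, avg_days FROM mart_slow_turn WHERE avg_days > 60"
    ).fetchall()
    con.close()
    return report

rows = [("sedan", "d1", 90), ("sedan", "d2", 80),
        ("coupe", "d1", 10), ("coupe", "d3", -5)]
print(run_pipeline(rows))  # the sedan averages 85 days in stock -> slow-turning
```

Step 6 (delivery as Excel workbooks and slides) is omitted; the report rows would simply be exported from the data mart.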
The data sources used for this solution are the following:
•“Stock/Sales/Orders” data sources are snapshots taken at the exact date of the data.
•“Production Plan” is a list of model quantities planned to be produced for a specific period.
•“TOPs” is a structured data source used to map all car configurations under production.
•“SPOT” lists dealers, together with their ratings and geographic information.
•“Web Activity” tracks all user web hits/requests on DCX websites.
The ETL was designed to transform the heterogeneous data source formats into a flat-file format, so that the data can be loaded as a bulk insert for higher performance.
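The flat-file normalization step can be illustrated as follows. This is a minimal sketch under made-up assumptions: two hypothetical source formats (a JSON feed and a CSV extract) with invented field names, normalized into one flat CSV suitable for a bulk insert.

```python
# Sketch of the ETL idea above: heterogeneous source formats (a hypothetical
# JSON feed and CSV extract) are normalized into a single flat file that a
# database bulk-insert utility could consume. Field names are illustrative.
import csv, io, json

def to_flat_file(json_feed, csv_extract):
    rows = []
    for rec in json.loads(json_feed):                        # source 1: JSON
        rows.append((rec["model"], rec["qty"]))
    for model, qty in csv.reader(io.StringIO(csv_extract)):  # source 2: CSV
        rows.append((model, int(qty)))
    out = io.StringIO()
    csv.writer(out).writerows(rows)                          # one flat format
    return out.getvalue()

flat = to_flat_file('[{"model": "sedan", "qty": 3}]', "coupe,5\r\n")
print(flat)
```

In the real system the output would be written to disk and handed to the warehouse's bulk loader rather than returned as a string.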
Conclusion
This paper also surveys the data mining techniques developed in the car market area. A classification of the available data mining techniques is provided, based on the kinds of databases to be mined, the kinds of knowledge to be discovered, and the kinds of techniques to be adopted. The survey is organized according to one classification scheme: the kinds of knowledge to be mined.
***
CURRENT UPDATES UNDER APPLICATION OF DATA MINING TECHNIQUE IN E-COMMERCE
ASHISH KUMAR: Research Scholar Computer Science - Shri Venkateshwara University Gajraula, Amroha, (U.P.),
Data is becoming an important part of people's lives, and the web is a very good place to run a successful business. Selling products or services online plays an important role in the success of businesses that also have a physical presence, such as retail businesses. It is therefore important to have a successful website to serve as a sales and marketing tool. One of the technologies used effectively for this purpose is data mining: the process of extracting interesting patterns from large databases. Web mining is the use of data mining techniques to extract interesting information from web data.
Electronic commerce is changing the face of business. It allows better customer management, new strategies for marketing, an expanded range of products and more efficient operations. A key enabler of this change is the widespread use of increasingly sophisticated data mining tools.
Early efforts at this were based on market segmentation. Businesses attempted to discover clusters of consumers who were similar, and then would develop payment plans, ad campaigns, special discounts and other policies designed for each cluster (especially the most profitable). The data mining tool used for this was cluster analysis, and the most famous commercial pioneer is Claritas, which used Census data to identify 64 “clusters” of consumers, with shorthand descriptors such as “kids and cul-de-sacs” or “money and brains” or “back country folks.” Clusters get revised periodically to reflect important changes; for example, Claritas recently added the cluster “young digerati” to reflect the important technophile segment.
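The cluster-analysis idea behind this kind of segmentation can be sketched with a minimal k-means implementation. The data and attribute names (income, web use) are invented for illustration; this is a generic k-means sketch, not Claritas' actual method or data.

```python
# Minimal k-means sketch of consumer segmentation. The two attributes
# (income in $1000s, weekly web-use score) and all data points are made up.
import random

def kmeans(points, k, iters=20, seed=0):
    rng = random.Random(seed)
    centers = rng.sample(points, k)           # pick k initial centers
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:                      # assign each point to nearest center
            i = min(range(k),
                    key=lambda i: (p[0]-centers[i][0])**2 + (p[1]-centers[i][1])**2)
            clusters[i].append(p)
        centers = [                           # move each center to its cluster mean
            (sum(p[0] for p in c)/len(c), sum(p[1] for p in c)/len(c)) if c
            else centers[i]
            for i, c in enumerate(clusters)
        ]
    return centers, clusters

# Two obvious consumer segments: low-income/low-web vs high-income/high-web.
pts = [(20, 1), (22, 2), (21, 1), (90, 9), (95, 8), (92, 9)]
centers, clusters = kmeans(pts, k=2)
print(sorted(centers))
```

Each recovered center is the "shorthand descriptor" of its segment; commercial systems then attach marketing labels to such centers, as in the Claritas examples above.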
***
STUDY OF REASONS AND PROBLEMS RECTIFICATION IN PROJECT DEVELOPMENT UNDER DATA MINING
Dhyan Chandra
Research Scholar Computer Science - Shri Venkateshwara University Gajraula, Amroha, (U.P.),
Data mining techniques have been applied successfully to many areas, such as marketing, medicine, finance, and car manufacturing. In this paper, a proposed data mining application in the car manufacturing domain is explained and tested. The overall study reveals that data mining techniques give people new power to explore and manipulate large volumes of existing data. The data mining process discovers interesting information hidden in the data, which can be used for future prediction and/or for intelligently summarizing the data. The results demonstrate the capability of data mining techniques to provide important analyses, such as launch analysis and slow-turn analysis, which give the car market a basis for more accurate prediction of future market needs.
In summary, without business knowledge, not a single step of the data mining process can be effective; there are no “purely technical” steps. Business knowledge guides the process towards useful results, and enables the recognition of those results that are useful. Data mining is an iterative process, with business knowledge at its core, driving continual improvement of results.
The second aspect is making the data more informative with respect to the business problem – for example, certain derived fields or aggregates may be relevant to the data mining question; the data miner knows this through business knowledge and data knowledge. By including these fields in the data, the data miner manipulates the search space to make it possible or easier for their preferred techniques to find a solution.
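The derived-field idea can be made concrete with a small sketch. The field names (arrived, days_in_stock) and the per-model average are hypothetical examples of the kind of derived field and aggregate a data miner might add; they are not taken from the paper.

```python
# Sketch of "making the data more informative": a derived field
# (days_in_stock) and an aggregate (average stock age per model) that a
# model could consume directly. All field names are hypothetical.
from datetime import date

def enrich(records, snapshot=date(2024, 1, 31)):
    # Derived field: how long each car has sat in stock at the snapshot date.
    for r in records:
        r["days_in_stock"] = (snapshot - r["arrived"]).days
    # Aggregate: average stock age per model - a candidate model input.
    totals = {}
    for r in records:
        totals.setdefault(r["model"], []).append(r["days_in_stock"])
    return {m: sum(v) / len(v) for m, v in totals.items()}

recs = [{"model": "sedan", "arrived": date(2024, 1, 1)},
        {"model": "sedan", "arrived": date(2024, 1, 21)},
        {"model": "coupe", "arrived": date(2024, 1, 29)}]
print(enrich(recs))
```

Neither the raw arrival dates nor the totals are directly usable by most models; the derived field and aggregate reshape the search space, as the text describes.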
There are five factors that contribute to the necessity of experimentation in finding data mining solutions:
1. If the problem space were well-understood, the data mining process would not be needed – data mining is the process of searching for as yet unknown connections.
2. For a given application, there is not only one problem space; different models may be used to solve different parts of the problem, and the way in which the problem is decomposed is itself often the result of data mining and not known before the process begins.
3. The data miner manipulates, or “shapes”, the problem space by data preparation, so that the grounds for evaluating a model are constantly shifting.
4. There is no technical measure of value for a predictive model (see 8th law).
5. The business objective itself undergoes revision and development during the data mining process, so that the appropriate data mining goals may change completely.
***
Modern Scenario of Cryptography and Message Security
Bhanu Prasad Vishwakarma : Research Scholar Computer Science - Shri Venkateshwara University Gajraula, Amroha, (U.P.),
The requirements of a security regime which addresses these risks are conventionally summarized under four broad headings. For the sender and receiver to be confident in the outcome of their communications, all of these requirements need to be satisfied.
(1) 'Confidentiality', or Message Content Security
This comprises two separate requirements: during a message's transit from sender to receiver,
- no observer can access the contents of the message; and
- no observer can identify the sender and receiver.
The term 'confidentiality' is used by computer scientists who specialize in security matters. This is most unfortunate, because the term has an entirely different meaning within commerce generally, which derives from the law of confidence. For this reason, the alternative term 'message content security' is used in this Module.
(2) Integrity of Message Content
This requires that the recipient can be sure that, whether accidentally or because of an action by any party:
- the message has not been changed or lost during transmission;
- the message has not been prevented from reaching the recipient; and
- the message has not reached the recipient twice.
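The first of these requirements, detecting changes to the message, is commonly met with a message authentication code. The following is a minimal standard-library sketch using an HMAC under a hypothetical shared secret; note it only detects modification, while loss and duplicate delivery need additional mechanisms such as acknowledgements and sequence numbers.

```python
# Minimal sketch of message-integrity checking with an HMAC. The shared
# secret is hypothetical; with a shared key this also gives the recipient
# some assurance about the sender (see the authentication heading below).
import hmac, hashlib

SECRET = b"shared-secret-key"  # assumed to be known only to sender and recipient

def tag(message: bytes) -> bytes:
    return hmac.new(SECRET, message, hashlib.sha256).digest()

def verify(message: bytes, received_tag: bytes) -> bool:
    # Constant-time comparison avoids leaking information via timing.
    return hmac.compare_digest(tag(message), received_tag)

msg = b"ship 100 units"
t = tag(msg)
print(verify(msg, t))                 # True: content unchanged in transit
print(verify(b"ship 900 units", t))   # False: content was altered
```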
(3) Authentication of the Sender and Recipient
This requires that:
- the sender can be sure that the message reaches the intended recipient, and only the intended recipient; and
- the recipient can be sure that the message came from the sender and not an imposter. The act by an imposter of sending such a message is referred to as 'spoofing'.
(4) Non-Repudiation by the Sender and Recipient
This requires that:
- the sender cannot credibly deny that the message was sent by them; and
- the recipient cannot credibly deny that the message was received by them.
Symmetric cryptography involves a single, secret key, which both the message-sender and the message-recipient must have. It is used by the sender to encrypt the message, and by the recipient to decrypt it.

The NSA stated in the mid-1990s that a 40-bit key length was acceptable to them (i.e. they could crack it sufficiently quickly). Increasing processor speeds, combined with loosely-coupled multi-processor configurations, are bringing the ability to crack such short keys within the reach of much less well-funded organizations. To be 'strong', the key length therefore needed to be at least 56 bits in 1998, and an expert group argued as early as 1996 that 90 bits is a more appropriate length.

Symmetric cryptography provides a means of satisfying the requirement of message content security, because the content cannot be read without the secret key. There remains a risk exposure, however, because neither party can be sure that the other party has not exposed the secret key to a third party (whether accidentally or intentionally).
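The single-shared-key idea can be illustrated with a toy cipher. This sketch is for illustration only and is NOT secure or suitable for production; it merely shows the symmetry (the same key, and here even the same function, both encrypts and decrypts). Real systems should use a vetted cipher such as AES.

```python
# Illustrative (NOT secure) symmetric cipher: a keystream derived from
# SHA-256 in counter mode is XORed with the message. Demonstrates only
# that one shared secret key serves both sender and recipient.
import hashlib

def keystream(key: bytes, n: int) -> bytes:
    out = b""
    counter = 0
    while len(out) < n:
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:n]

def crypt(key: bytes, data: bytes) -> bytes:
    # XOR is its own inverse: the same call encrypts and decrypts.
    return bytes(a ^ b for a, b in zip(data, keystream(key, len(data))))

key = b"shared secret"
ct = crypt(key, b"attack at dawn")   # sender encrypts with the shared key
print(crypt(key, ct))                # recipient decrypts with the same key
```

The risk noted above is visible here: anyone who obtains `key` can read every message, which is why key distribution and escrow (discussed below) matter so much.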
Conclusion
If a person or organization loses their private key, they are unable to:
- encrypt messages with their private key; and
- read messages sent to them encrypted with their own public key.
If the private key is stored only on a person's workstation or chip, it is vulnerable to theft, loss, damage and malfunction of that device. In order to address those risks, it is strongly advisable that every private key be placed into deposit in some location separate from that person's normal workstation. Because of the risks involved if the private key comes to be known by some other person, the deposit needs to be subject to an appropriately high set of security standards.
If, however, security is to be sustained (and, indeed, if privacy is to be protected), any access to escrowed keys would need to be subject to very carefully designed and implemented controls, e.g. a prior requirement of legal authority (such as a search warrant), granted by a senior member of the judiciary.
If key escrow is implemented, it might be:
- voluntary;
- voluntary for individuals but mandatory for corporations;
- mandatory for all users; or
- mandatory for dealings with government;
and the function might be performed by:
- one or more individuals and/or service organizations of the key-owner's choice;
- a service organization which must be licensed, and which, as a condition of the license, has to satisfy certain conditions; or
- a specified government agency or agencies.
Volume VI AIRO NATIONAL JOURNAL ISSN : 2321-3914
Review, Comparison & Modern Changes in British & American English
Submitted by: TANU PRIYA, Research Scholar, English

An Introduction to Grobner Bases
Submitted by: Rajwinder Kaur, Research Scholar, Mathematics

Womanhood in the light of The Life Divine
By: Umesh Prasad, Research Scholar, English, Jabalpur, Madhya Pradesh, India

Implementation & Analysis on Cloud Computing Process & Grid Architecture
Submitted by: SHEETAL, Research Scholar, Computer Sc.

Role of Micro-bacteria in Cleaning of the Organic Pollutants from Yamuna River
Submitted by: SAT PAUL, Research Scholar, Zoology

LITERARY CRITICISM AND TRADITION IN ELIOT'S POETRY
Submitted by: Dr. Sarita, Research Scholar, English

The Electronic Nose
Submitted by: Dr. S. Viswanadha Raju, Professor, Department of Computer Science & Engineering, JNTU, Hyderabad
& M. Naveen Kumar, Research Scholar, Ph.D in Computer Science, Dravidian University
Review, Comparison & Modern Changes in British & American English
Submitted by : TANU PRIYA Research Scholar English
Introduction
The use of English in the United States was inherited from British colonization. The first wave of English-speaking settlers arrived in North America in the 17th century. During that time, there were also speakers in North America of Spanish, French, Dutch, German, Norwegian, Swedish, Scots, Welsh, Irish, Scottish Gaelic, Finnish, Russian (Alaska) and numerous Native American languages.
The English language was first introduced to the Americas by British colonization, beginning in 1607 in Jamestown, Virginia. Similarly, the language spread to numerous other parts of the world as a result of British trade and colonization elsewhere and the spread of the former British Empire, which, by 1921, held sway over a population of 470–570 million people, approximately a quarter of the world's population at that time.
Although spoken American and British English are generally mutually intelligible, there are occasional differences which might cause embarrassment: for example, in American English a rubber is usually interpreted as a condom rather than an eraser, and a British fanny refers to the female pubic area, while the American fanny refers to one's buttocks (US ass, UK arse). Likewise, the Australian root means to have sexual intercourse, whilst in both British and American English to root for someone means to support them.
Review of Literature
American English (AmE) and British English (BrE) differ at the levels of phonology, phonetics, vocabulary, and, to a lesser extent, grammar and orthography. The first large American dictionary, An American Dictionary of the English Language, was written by Noah Webster in 1828; Webster intended to show that the United States, which was a relatively new country at the time, spoke a different dialect from that of Britain.
Differences in grammar are relatively minor, and normally do not affect mutual intelligibility; these include, but are not limited to: different use of some verbal auxiliaries; formal (rather than notional) agreement with collective nouns; different preferences for the past forms of a few verbs (e.g. AmE/BrE: learned/learnt, burned/burnt, and in sneak, dive, get); different prepositions and adverbs in certain contexts (e.g. AmE in school, BrE at school); and whether or not a definite article is used, in very few cases (AmE to the hospital, BrE to hospital). Often, these differences are a matter of relative preferences rather than absolute rules; and most are not stable, since the two varieties are constantly influencing each other.
Differences in orthography are also trivial. Some of the forms that now serve to distinguish American from British spelling (color for colour, center for centre, traveler for traveller, etc.) were introduced by Noah Webster himself; others are due to spelling tendencies in Britain from the 17th century until the present day (e.g. -ise for -ize, although the Oxford English Dictionary still prefers the -ize ending) and cases favored by the francophile tastes of 19th century Victorian England, which had little effect on AmE (e.g. programme for program, manoeuvre for maneuver, skilful for skillful, cheque for check, etc.).
The most noticeable differences between AmE and BrE are at the levels of pronunciation and vocabulary.
While written AmE is standardized across the country, there are several recognizable variations in the spoken language, both in pronunciation and in vernacular vocabulary. General American is the name given to any American accent that is relatively free of noticeable regional influences.
After the Civil War, the settlement of the western territories by migrants from the Eastern U.S. led to dialect mixing and leveling, so that regional dialects are most strongly differentiated along the Eastern seaboard. The Connecticut River and Long Island Sound is usually regarded as the southern/western extent of New England speech, which has its roots in the speech of the Puritans from East Anglia who settled in the Massachusetts Bay Colony. The Potomac River generally divides a group of Northern coastal dialects from the beginning of the Coastal Southern dialect area; in between these two rivers several local variations exist, chief among them the one that prevails in and around New York City and northern New Jersey, which developed on a Dutch substratum after the British conquered New Amsterdam. The main features of Coastal Southern speech can be traced to the speech of the English from the West Country who settled in Virginia after leaving England at the time of the English Civil War, and to the African influences from the African Americans who were enslaved in the South.
Although no longer region-specific, African American Vernacular English, which remains prevalent among African Americans, has a close relationship to Southern varieties of AmE and has greatly influenced everyday speech of many Americans.
A distinctive speech pattern also appears near the border between Canada and the United States, centered on the Great Lakes region (but only on the American side). This is the Inland North Dialect—the "standard Midwestern" speech that was the basis for General American in the mid-20th Century (although it has been recently modified by the northern cities vowel shift). In the interior, the situation is very different. West of the Appalachian Mountains begins the broad zone of what is generally called "Midland" speech. This is divided into two discrete subdivisions, the North Midland that begins north of the Ohio River valley area, and the South Midland speech; sometimes the former is designated simply "Midland" and the latter is reckoned as "Highland Southern." The North Midland speech continues to expand westward until it becomes the closely related Western dialect which contains Pacific Northwest English as well as the well-known California English, although in the immediate San Francisco area some older speakers do not possess the cot-caught merger and thus retain the distinction between words such as cot and caught which reflects a historical Mid-Atlantic heritage.
The South Midland or Highland Southern dialect follows the Ohio River in a generally southwesterly direction, moves across Arkansas and Oklahoma west of the Mississippi, and peters out in West Texas.