Saturday, August 31, 2019

Biometrics

Biometrics are automated methods of recognizing a person based on a physiological or behavioral characteristic. Among the features measured are face, fingerprints, hand geometry, handwriting, iris, retina, vein, and voice. Biometric data are separate and distinct from personal information: biometric templates cannot be reverse-engineered to recreate personal information, and they cannot be stolen and used to access personal information. Using a unique physical attribute of your body, such as your fingerprint or iris, to effortlessly identify and verify that you are who you claim to be is one of the simplest and most reliable solutions on the market today. That is the power of biometric technology. Although biometric technology has been around for many years, modern advances, coupled with large reductions in cost, now make biometrics readily available and affordable to consumers, small business owners, larger corporations, and public sector agencies alike.

How Does a Fingerprint Optical Scanner Work?

A fingerprint scanner system has two basic jobs: it needs to get an image of your finger, and it needs to determine whether the pattern of ridges and valleys in this image matches the pattern of ridges and valleys in pre-scanned images. Only specific characteristics, which are unique to every fingerprint, are filtered and saved as an encrypted biometric key or mathematical representation. No image of a fingerprint is ever saved, only a series of numbers (a binary code), which is used for verification. The template cannot be converted back into an image, so no one can duplicate your fingerprints.

Employee Privacy and Cleanliness Concerns

It is important to note that Easy Clocking's biometric time clocks do not actually collect and store fingerprints. Instead, they save a mathematical representation of the employee's biometric data. 
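The idea that only a mathematical representation is stored, never the image itself, can be illustrated with a toy sketch. The feature extraction, threshold, and all numbers below are invented for illustration; real scanners extract minutiae (ridge endings and bifurcations) and store proprietary encrypted templates, not these simple statistics.

```python
# Toy illustration: a fingerprint "template" as a small feature vector.
# The raw scan (a 2-D grid of pixel values) is discarded after enrollment;
# only a handful of derived numbers is kept, and verification compares
# numbers, never images. All values here are made up.

def extract_template(scan):
    """Reduce a raw scan to three illustrative features."""
    flat = [p for row in scan for p in row]
    n = len(flat)
    return [
        sum(flat) / n,                       # average intensity
        max(flat) - min(flat),               # contrast
        sum(1 for p in flat if p > 128) / n  # fraction of "ridge" pixels
    ]

def matches(stored, candidate, tolerance=0.1):
    """Verification: compare feature vectors within a tolerance."""
    return all(abs(a - b) <= tolerance * (abs(a) + 1)
               for a, b in zip(stored, candidate))

enrolled = extract_template([[120, 130], [140, 125]])  # enrollment scan
later    = extract_template([[121, 129], [141, 126]])  # later clock-in scan
print(matches(enrolled, later))  # True: close enough to verify
```

Note that many different scans map to similar feature vectors, which is exactly why the stored numbers cannot be run backwards to reconstruct the original image.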
When the biometric time clock scans a hand or finger during a supervised enrollment process, only an encrypted mathematical representation of the fingerprint is stored. As a result, it is virtually impossible to duplicate the original image from that mathematical representation. Additionally, if employees question cleanliness, this concern should not be dismissed. Instead, you should assure employees that the time clock's finger zone is not a hot zone for germs. In fact, it will be touched far less frequently than restroom door handles, water cooler spigots, or chairs in the break room.

Top Advantages of Fingerprint Authentication

There are several ways an electronic time clock system can verify that somebody is who they say they are. Most systems are looking for one or more of the following: what you have, what you know, and who you are. To get past a "what you have" system, you need some sort of "token," such as an identity card with a magnetic strip. A "what you know" system requires you to enter a password or PIN. A "who you are" system looks for physical evidence that you are who you say you are: a specific fingerprint pattern. "Who you are" systems like Easy Clocking fingerprint time clocks have a number of advantages over other systems. To name a few: fingerprints are much harder to fake than identity cards; you can't guess a fingerprint pattern like you can guess a password; you can't misplace your fingerprint like you can misplace an access card; and you can't forget your fingerprint like you can forget a password.

Conclusion on Biometrics & Workforce Management

Biometrics has been used effectively for more than a decade for time and attendance and workforce management. Despite widespread use, confusion and misconceptions about the technology and its capabilities persist. These concerns are easily dispelled when the facts about biometrics are established. 
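The three verification factors just described can be sketched side by side. The card IDs, PINs, template values, and function names below are invented placeholders, not any vendor's actual API.

```python
# Sketch of the three verification factors: "what you have" (a token),
# "what you know" (a secret), and "who you are" (a biometric template).
# All identifiers and values are illustrative only.

VALID_CARDS = {"card-1042"}                  # what you have
PIN_DB      = {"alice": "4821"}              # what you know
TEMPLATES   = {"alice": [128.75, 20, 0.5]}   # who you are (feature vector)

def check_have(card_id):
    return card_id in VALID_CARDS

def check_know(user, pin):
    return PIN_DB.get(user) == pin

def check_are(user, scanned_template, tolerance=0.1):
    stored = TEMPLATES.get(user)
    if stored is None:
        return False
    return all(abs(a - b) <= tolerance * (abs(a) + 1)
               for a, b in zip(stored, scanned_template))

# A card can be borrowed and a PIN can be guessed, but the biometric
# check only passes when the live scan is close to the enrolled template.
print(check_have("card-1042"))               # True
print(check_know("alice", "0000"))           # False: wrong PIN
print(check_are("alice", [129.0, 20, 0.5]))  # True: close match
```

The contrast between the three checks mirrors the advantages listed above: the first two compare exact, transferable secrets, while the third compares a measurement of the person who is actually present.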
Biometrics offers an unparalleled ability to quickly and accurately capture real-time labor data and provide a non-repudiated audit trail. Biometrics has undergone intense scrutiny, and the results are in: when properly deployed, biometric systems work well and are safe, secure, and accurate. Biometrics offers organizations a broader range of direct and indirect time, cost, and operational benefits than alternative time and attendance methods. Today, over one hundred thousand thriving organizations rely on Easy Clocking's time and attendance systems to automate their employee attendance, and as a result they are seeing a significant reduction in direct and indirect labor costs.

Friday, August 30, 2019

Sections:

Section Four: Software to Support Assessment

1) In a 750-1,000 word essay, draft your proposal to utilize software to support assessment in the classroom as a part of your Comprehensive Classroom Technology Plan (Benchmark Assessment), which is due in Module 7. Consider the following:
a) In what ways can technology facilitate the ongoing effort to assess student learning?
b) What is the difference between formative assessment and summative assessment, and how can technology be used to facilitate both?
c) What are the pros and cons of using technology to assess student learning?
d) Should a teacher only use technology to assess student learning? Why or why not?
2) Support your rationale from your required readings and from three to five peer-reviewed articles from the GCU eLibrary.
3) Prepare this assignment according to the guidelines found in the GCU Style Guide, located in the Student Success Center. An abstract is not required.
4) The instructor will provide commentary on your draft, which you will then use as a basis for revising this section. The revised section will then become part of the complete Comprehensive Classroom Technology Plan (Benchmark Assessment), due in Module 7. Refer to the rubric portion of this resource (below) for grading criteria for this assignment.
5) The draft of this section is due by the end of Module 5.

Section Five: Technology Ethics in the Classroom

1) Construct a 750-1,000 word essay drafting your proposal for the ethical uses of technology in the classroom as a part of your Comprehensive Classroom Technology Plan (Benchmark Assessment), which is due in Module 7.
2) Address the following issues in your essay:
a) Discuss Internet security and how you would implement this in your classroom to protect the students from Internet predators and any inappropriate material.
b) Explain the way that you would create proper copyright and student use agreements that could be used in your classroom or in any K-12 classroom.
c) Discuss three ethical practices of technology use that you would implement in your classroom and explain their importance in a K-12 classroom.
3) Use the GCU eLibrary to research a minimum of three to five peer-reviewed articles that can be used in support of your content.
4) Prepare this assignment according to the guidelines found in the GCU Style Guide, located in the Student Success Center. An abstract is not required.

Thursday, August 29, 2019

Summary of God's relationship with Hebrew people Essay

Summary of God's relationship with Hebrew people - Essay Example food and companions to the first couple; God's punishing of the first couple and serpent; Cain and Abel's offerings to God; God's interaction with Noah; God's second punishment -- the Great Flood; God's promise to Abraham; and God's encounter with Jonah and Job. Collectively, God's relationship with the Hebrews was established through the Hebrew understanding and living of the themes carried out by these Biblical events. Moreover, the relationship was also strongly made out of the fact that the Hebrews, just like the other races, came from the same descendants -- the first couple. Specifically, the Hebrews were said to be a race descended from Shem, one of Noah's children. God's relationship with the Hebrews took precedence over any other relationship, as significantly reflected in the Hebrews' dedication. This dedication came to fruition through the writing of the Bible. Consequently, the Hebrew people were able to pass on God's will, through an emphasis on themes, to the next generation of worshippers. Because of this emphasis on themes, timelines were ignored and contradicting facts ensued (Dj Love, Chapter 15, Study

Wednesday, August 28, 2019

Crosstalk IT Coursework Example | Topics and Well Written Essays - 2000 words

Crosstalk IT - Coursework Example The first type of coupling, conductive coupling, results from physical contact between conductors. Inductive crosstalk results when the current passing through one conductor generates a similar current in another conductor. The third type, capacitive coupling, results when two or more conductors come close enough together that they begin to act as a capacitor. Figure 1: Introduction to Crosstalk. Source: polarinstruments.com

Crosstalk may be prevented by positioning conductors at sufficient distances from one another. Alternatively, introducing insulation between the conductors can lessen crosstalk. In practice it is not possible to keep wires far apart, so insulation is the better option. Crosstalk due to inductive coupling may be reduced by twisting the affected cables around each other; the inter-wound cables are referred to as twisted pair cables. Crosstalk is reduced because the cross-section of the field over which it can occur is smaller, so electromagnetic interference drops considerably. Digital signals are often affected little or not at all by crosstalk; the signals most affected are analogue in nature.

The term has been chosen owing to my personal experience of handling issues related to it in my organization. The networking setup had old fixtures within the entire organizational setup and had to be replaced with better cables that promised reduced interference and thus a lower possibility of crosstalk.

CAT 6 CABLE

Ethernet, or LAN to put it simply, is the means by which the computer systems present within a closed area are connected together into a network via hardware cables. The cables used for this purpose come in various standards. CAT 6 cable is one of them. 
The technology of Ethernet is integral to organizational networks, and so is the presence of CAT 6 cable in it. Figure 2: Cat 6 Cable. Source: http://www.openxtra.co.uk/articles/category-6-cat6-cable

It is given the name CAT 6 because it belongs to the sixth generation of cables made for Ethernet technology. Specifically, it supports the following gigabit Ethernet technologies: 10BASE-T/100BASE-TX and 1000BASE-T/1000BASE-TX. The cable consists of eight wires, just like its predecessor, the Cat 5 cable. However, it differs from the former version in that it makes use of all four of its pairs in data transfer. The Category 6 cable was used in establishing the Local Area Network of the organization I work for. The selection and successful installation of this cable was my task, and that is how I encountered it. This cable was preferred over the rest of the available options because it is the latest version of cable that complies fully with the standard Ethernet protocols for data transfer and provides efficient transfer without any data loss.

MP3

MP3 is a short form of MPEG-2 Audio Layer III. It is a standard format for digital music storage that was designed by the Moving Picture Experts Group (MPEG) around the mid-1990s. High-quality audio files can be created using this standard with extremely optimized storage space. The MP3 format is widely used over the internet to share and transfer audio files. How the magic of MP3 works. The
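The reason twisting reduces inductive crosstalk, described earlier, is that both conductors of a pair pick up nearly the same interference, which cancels when the receiver takes the difference between the wires. This can be sketched numerically; the signal and noise values below are invented for illustration only.

```python
# Toy model of differential signaling on a twisted pair. Because the two
# conductors are twisted together, an interfering field couples almost the
# same noise voltage onto both wires; the receiver subtracts one wire from
# the other, so the common noise cancels out. All voltages are made up.

signal = [0.0, 1.0, 1.0, 0.0, 1.0]    # data: driven as +s on one wire, -s on the other
noise  = [0.3, -0.2, 0.5, 0.1, -0.4]  # induced crosstalk, common to both wires

wire_a = [ s + n for s, n in zip(signal, noise)]  # +signal plus noise
wire_b = [-s + n for s, n in zip(signal, noise)]  # -signal plus the same noise

# Differential receiver: (a - b) / 2 recovers the signal; the shared noise
# term cancels. round() just sweeps away floating-point dust.
received = [round((a - b) / 2, 9) for a, b in zip(wire_a, wire_b)]
print(received)  # [0.0, 1.0, 1.0, 0.0, 1.0] -- the noise has cancelled
```

A single-ended wire would have delivered `wire_a` directly, noise and all, which is why twisted pairs (as in Cat 5 and Cat 6 cable) pair twisting with differential transmission.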

Tuesday, August 27, 2019

Accidents Resulting from Police High Speed Pursuits And Responses to Research Paper

Accidents Resulting from Police High Speed Pursuits And Responses to Emergency Calls - Research Paper Example In order to exploit the public interest in watching such risky chases by police officers, some television channels currently telecast such chases. However, it should be noted that many of these risky chases end in severe accidents. Both the offenders and the chasers suffer accidents and severe injuries because of high-speed police pursuits. There are plenty of incidents in which lawsuits arise against the government and the police as a result of high-speed police pursuits. There is a heated national debate ongoing concerning when, if at all, police should be involved in high-speed pursuits. One side says that police should use their discretion and should not terminate a chase merely because of an increased risk to the public. The opposite view is that by chasing an offender, the police magnify the risk of injury to the general public (Sanderson, n.d., p.2). It is difficult for the police to act as silent witnesses when people violate laws. The purpose of a police force is to enforce the implementation of law and order in a society or country. This purpose will not be served if police officers remain idle when people violate law and order. At the same time, it is the duty of police officers to catch or arrest offenders in a safe manner. It should be noted that risky chasing of offenders may endanger not only the offender and the chasers but also innocent people. Accidents resulting from responses to emergency calls are also substantial in volume in America. It should be noted that the first hour immediately after a road accident is critical in saving the lives of the injured. In an attempt to reach the site as quickly as possible, traffic police are often forced to drive their vehicles at high speed. 
This high-speed driving often ends in unexpected accidents in which both the police and innocent people suffer injuries or death. In short, unintentional or unexpected accidents due to high-speed police pursuits and responses to emergency calls are growing day by day in America, and various lawsuits are arising as a result. This paper analyses the legal dimensions of accidents caused by high-speed police pursuits and responses to emergency calls.

Review of Literature

"Police pursuits contribute to traffic violation, loss of officers, and death or injury of innocent people. According to a study by Jeff Martin, fleeing suspects create tragedies despite the efforts to terminate pursuits" (Schultz et al., 2010, p.1). Hoffmann & Mazerolle (2005) pointed out that "Police high-speed pursuits present a difficult area for police managers and policy makers because of the important need to balance public safety with the mandate to enforce laws" (Hoffmann & Mazerolle, 2005, p.530). Kaminski et al. (2012) argued that "in vast majority of pursuits, deputies and suspects were uninjured or sustained only minor injuries. In this regard, they do not appear to be any more hazardous than resistive encounters generally" (Kaminski et al., 2012, p.177). However, there are plenty of cases in which suspects or police suffer severe injuries and even death as a result of high-speed police pursuits. John Hill (2002) pointed out that "police pursuits result in about 350 deaths per year and the number of pursuits increases each year. Moreover, about 2,500 persons die each year as a result of police pursuits and that another 55,000 are injured" (Hill, 2002, p.14-15). A study conducted by Rivara & Mack in 2004 to determine the motor vehicle crash deaths related to police pursuits yielded the

Monday, August 26, 2019

Gangs & gang behavior - week 8 Essay Example | Topics and Well Written Essays - 500 words

Gangs & gang behavior - week 8 - Essay Example The founder of the gang was a member of a Hispanic gang who was properly oriented in and influenced by gang warfare. They adhere to the essence of brotherhood, for it is something that provides the protection they need while they are incarcerated. It is undeniably true that the gang has been largely shaped by Hispanic culture because of the orientation of its founder. The founder was Hispanic, adhering to the principles and cultural beliefs of Hispanic culture. However, as time went by, the gang became home to members coming from diverse cultures, and each member is expected to adopt the culture set for the group. As stated, the gang was instituted by Hispanic prisoners. The inception of this group took place within the four walls of a correction centre that has become home to thousands of inmates with varying criminal records. Over time, the gang became known as the Mexican Mafia, which split into two groups: one adhering to the old philosophy of the gang, and the other considering itself the New Mexican Mafia. The Mexican Mafia, a Hispanic gang, adheres to the value of alliance because of existing rivalries with other gangs, solely for the purpose of protecting its members from rivals. The gang has important rules to follow: homosexuality, snitching, cowardice, fighting among members, disrespect, stealing, and interfering are not acceptable to them. There is no formal constitution within the gang, but as long as the set rules are met, all is well with their prevailing system. So far, there is no hint that the gang is blood in, blood out. In fact, as already stated, there were two prevailing groups of the Mexican Mafia. The new gang to separate from the original group has given the chance to choose the

Sunday, August 25, 2019

Antibiotic Sensitivity Testing Essay Example | Topics and Well Written Essays - 1500 words

Antibiotic Sensitivity Testing - Essay Example The principle of the disc diffusion method is that when a filter disc impregnated with a chemical is placed on the agar, the chemical diffuses into the agar around the disc. The solubility of the chemical and its molecular size determine the size of the area of chemical infiltration around the disc. When an organism is plated on the agar around the disc, if it is susceptible to the chemical then there will be no growth in the area where the chemical has diffused; this area is called the "zone of inhibition". This zone can be measured and compared with a standard control strain and standard tables, using the Stokes and Kirby-Bauer methods respectively. The factors that affect this method are the concentration of the bacterial inoculum, the depth and type of agar, the incubation conditions, and the time of incubation. All these factors should always be taken into account while performing the test. An alternative test is the determination of the amount of antibiotic required either to inhibit the growth of the organism or to kill it, which is done by incubating a fixed concentration of the organism in increasing concentrations of antibiotic and checking for growth after 24 h of incubation.

3. Methods. The assigned culture was swabbed onto the agar plate in three directions for maximal coverage. The plates were allowed to settle for 10 minutes, followed by impregnating the antibiotic discs by pushing the dispenser over the agar. The antibiotic and the disc code were noted and the plates were kept in an incubator at 37°C.

4. Results. The results obtained from Test 1 and Test 2 are displayed in Tables 1 and 2 respectively.

5. Discussion: Antibiotic susceptibility of the given strain was determined by the Kirby-Bauer disc diffusion method. 
The disc diffusion method is widely used across laboratories to determine the effect of antimicrobial agents such as antiseptics, antibiotics, and bactericidal or bacteriostatic compounds. To avoid lab-to-lab variation, a standardized protocol was developed and recommended by the National Committee for Clinical Laboratory Standards: Mueller-Hinton agar is used as the growth medium, plates are incubated at 37°C for 18-24 hr, and the turbidity of the bacterial suspension is set at the 0.5 McFarland standard or 1.0 OD by spectrophotometer. The biggest advantages of the disc diffusion method over other methods are that it is rapid, efficient, cost-effective, and reliable. There are, however, certain limitations: any variation in cell number, incubation time, diffusibility of the antibacterial compound, media, etc., leads to very high variation in the final results. Similarly, it gives only a qualitative idea, and the bactericidal concentration or MIC needs to be back-calculated. For pathological determination, the serum-to-antibiotic ratio must be taken into consideration to evaluate the effective dose. In this practical, we evaluated three cultures for their antibiotic susceptibility against various antibiotics. Among these Pseudomonas
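Reading a Kirby-Bauer plate, as described above, amounts to comparing each measured zone diameter against published breakpoints for that antibiotic and classifying the strain accordingly. The sketch below shows that comparison; the breakpoint numbers are invented placeholders, not real clinical values, and any real interpretation must use the current published tables for the specific organism/antibiotic pair.

```python
# Classify Kirby-Bauer zone-of-inhibition diameters (mm) against breakpoints.
# A small zone means the antibiotic failed to suppress growth near the disc
# (resistant); a large zone means growth was suppressed (susceptible).
# Breakpoint values here are illustrative placeholders only.

BREAKPOINTS = {
    # antibiotic: (resistant_max_mm, susceptible_min_mm)
    "ampicillin":   (13, 17),
    "tetracycline": (14, 19),
}

def interpret(antibiotic, zone_mm):
    r_max, s_min = BREAKPOINTS[antibiotic]
    if zone_mm <= r_max:
        return "resistant"
    if zone_mm >= s_min:
        return "susceptible"
    return "intermediate"  # zone falls between the two breakpoints

print(interpret("ampicillin", 12))    # resistant
print(interpret("ampicillin", 15))    # intermediate
print(interpret("tetracycline", 22))  # susceptible
```

The three-way split mirrors the standard susceptible/intermediate/resistant reporting, which is also why the method is qualitative: the zone size ranks the response but does not directly give an MIC.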

Saturday, August 24, 2019

SMB team on-demand CRM comparison guide by Ziff Davis Essay

SMB team on-demand CRM comparison guide by Ziff Davis - Essay Example The products can also be applied for single use or for departmental use. Each product also has a different pricing basis that varies in price, duration of license, and number of users per license. The products' mode of sale, year of development, and ownership are also diversified. They nevertheless serve a wide customer base in the global market (Davis, p. 1). The reported Customer Relationship Management products also have differentiated features. The products' specifications, such as versions, are distinct for each product. Similarly, the products' functions are diversified and include "sales, marketing, service, workflow, system customization, offline synchronization, customer service, reporting and inventory management" functions that are either singly or jointly applicable (Davis, p. 2). The products are also identified with special features that apply to some of the functionalities, such as sales and marketing. The content of the paper "SMB team on-demand CRM comparison guide by Ziff Davis" is relevant and valuable to my project, which seeks to recommend application of Customer Relationship Management to Macy's enterprises. This is because it identifies the features of different Customer Relationship Management products, the functions that identify each product's benefits, and the features of every product in relation to its functions. It therefore provides information that supports my project's recommendation of a specific product to Macy's Inc.

Protecting Young Childrens Welfare Essay Example | Topics and Well Written Essays - 2500 words

Protecting Young Childrens Welfare - Essay Example The abuse may be inflicted by adults, caregivers, acquaintances, and at times by other children as well (DoH et al 2006). The protection of a child from maltreatment constitutes a step towards the welfare of the child, and it is especially important to take care of small, vulnerable children. A child is said to be vulnerable when there is no assurance of satisfactory development of health (section 17 (10) of the Children Act 1989). It is imperative that all caregivers give paramount importance to ensuring that children are encouraged to develop in conditions which are conducive to their welfare and protection. This will ensure that they enter adulthood with great confidence and on a sure footing. The issue of safety and protection for children has been the focus of child care agencies for a long time, but the crucial role they play and their duty towards children's welfare has been brought into the limelight by certain tragic cases wherein a child died due to negligence on the part of caregivers. The death of Victoria Climbié made the government sit up and take serious note of the issue. Following the report prepared subsequent to the investigations, many hard facts about child safeguarding programmes have come up for review. The report that was published revealed that although the Children Act of 1989 is comprehensive, it fails due to improper interpretation and inefficient implementation. The staff in Area Child Protection Committees are often hampered by insufficient authority and lack of resources. Apart from this, caregivers very often do not know whether they should classify children as "in need" or as cases for "child protection". Thus, in order to create a system which is effective in addressing this defect, organizations have to pool their resources and come together to improve the child's safety net. They must guess the need for protection

Friday, August 23, 2019

Aerospace business management and legislation Essay

Aerospace business management and legislation - Essay Example are some of the reasons that the airline has stayed successful and has been able to survive the global economic problems that have befallen the aviation industry in recent times. Southwest Airlines has continued to be successful due to good financial planning on the part of the company's administration, which has ensured that the company has grown steadily over a period of more than 30 years. Because Southwest provides low-cost airline transport, new customers may assume that the airline's services are less professional because of low training budgets, or that the airline acquires cheaper facilities to lower operational costs (Butler and Keller, 2000). This can prevent the pickier travelers from trying out the airline's services and thus prevent the airline from getting their business. Although pickier customers would not mind paying less for tickets, they will probably not be willing to endure poor equipment or inadequate service. Upon entering any new market or market sector, the airline's low fares usually stimulate demand at a fast rate. Although this produces a higher load factor, the airline has been able to handle the increase in capacity through proper financial planning. In this light, some other airlines have been known to respond by dropping their own fares and further stimulating the total market. The airline's financial plan also caters to the financial needs of its employees, who are known to be well paid. This translates into more success for the company, as well-paid employees are usually happy employees and possess better morale than the employees of competitors. 
The airline has been using only one type of aircraft (the Boeing 737), a strategy intended to keep maintenance costs low as well as training costs, because the pilots, engineers, and flight attendants only have to undergo training for Boeing 737 airplanes.

Thursday, August 22, 2019

Situation of the local baths Essay Example for Free

Situation of the local baths Essay I am deeply honored to correspond with the local council regarding the situation of the local baths. It is also a privilege to interact and work hand in hand with the proper authorities concerning this matter. Through this, I can be a good servant. A question can be asked at any moment regarding the safety of the local baths: will there be a guarantee concerning its safety? Is it complete with safety precautions? Every person's primary concern in going to the local baths is the safety of their life and limb. Normally, it is the responsibility of the management of the local baths to make sure that every person is guaranteed complete safety. Indeed, it is my duty to correspond with the overall local council in dealing with this very important matter. I just hope that my recommendations will be given enough attention for appropriate review. It is common knowledge that high tides are a hazard to swimmers, especially during the Christmas and New Year season, and will prevent them from enjoying their stay at the establishment. Others are also concerned about the availability of safety precautions of whatever kind. Heavy tides in the sea tend to break over the cemented barrier, or any barrier found in the area, causing a back swell that can drag swimmers over the northern promenade into the ocean. Is that not dangerous? That is why effective safety precautions were incorporated into the local baths to prevent any harm. I have taken many safety precautions, including placing warning signs in prominent positions and installing a safety chain across the northern area. The safety chain serves to keep people from going too far into the northern side of the pool, which is known to be dangerous. Aside from that, a loudspeaker system has been built in order to regularly warn anyone using the pool of danger. 
The loudspeaker has proved useful because it reaches people staying at the northern poolside. There is also an organized store of safety and rescue equipment kept in an accessible position. However, danger cannot be avoided at all times, because high tides cannot be controlled by human power without extraordinary effort. The problem was not so alarming until one incident made the management uneasy. This year, a peculiar thing happened that caused alarm among the people: a north-easterly swell and winds created very heavy surf, and some swimmers were caught in it. It happened on the last Saturday of December. At the time, I announced over the loudspeaker system that the waves breaking onto the poolside were very dangerous, and with all diligence I instructed the people in the swimming pool area to leave the northern end of the pool. That was part of my effort, together with the other employees, to prevent injuries from the surf. Despite the announcement that no one should stay at the northern end of the pool, three people refused to move; they ignored an instruction given for their own benefit. That was indeed problematic and dangerous. As I walked towards the three people, a very large wave dragged them back into the dangerous surf. I was shocked, but tried to compose myself so that I could face the situation appropriately. I immediately radioed for the surf and rescue helicopter. As a result, two people were rescued unharmed; the third was seriously injured by the large wave that struck him. With this incident at hand, there is a need to overhaul the safety system of the local baths. There are many factors that cause injuries to swimmers in the pool.
These factors may involve negligence on the part of the management and even on the part of the swimmers. The loudspeaker system is not effective enough: while it is true that swimmers were given announcements whenever there was danger, that was not sufficient. What is really needed are trained lifeguards present at the poolside. Lifeguards are trained employees responsible for the safety of people in recreational water areas. Generally, lifeguards observe swimming activities, anticipate problems and identify emergencies, give immediate first aid and report incidents ("Lifeguards", 2007, p. 1). Therefore, it is respectfully recommended that trained lifeguards be hired to make the operation of the local baths safer and more productive. The rationale for this recommendation is that hiring additional people is more sensible than allowing serious injury or even death to occur. The presence of lifeguards will surely prevent accidents at the local baths, whether swimmers obey the instructions announced over the loudspeaker system or remain hardheaded. Accordingly, lifeguards must be trained in basic rescue techniques, first aid for aquatic injuries and effective coordination during emergencies, and their training must include observation and scanning of the swimming pool area. The local baths also need staff to keep watch for the slightest hint of dangerous surf or heavy winds, so that lifeguards can maintain uninterrupted supervision while swimmers enjoy the pool, especially in the most dangerous area. Finally, for the local council to approve this recommendation, it is respectfully submitted that the management of the local baths, under my leadership, will fully execute the policies and regulations required for a positive result.

Wednesday, August 21, 2019

Integrating Earned Value Management

Integrating Earned Value Management 3.3 EARNED VALUE PROJECT MANAGEMENT 3.3.1 Basics of Earned Value Project Management Project Management is often defined as the integrated management and control of Time, Cost, Resources and Quality for the successful on-time and on-budget completion of projects. Traditional approaches to PM range from simple Gantt Charts, which help represent the work to be done on a time scale, to techniques like CPM and PERT that address the needs of deterministic and probabilistic scheduling. All of these techniques tend to be used primarily for managing time; cost is often measured independently by the accountants. This separation between cost and time is a frequent cause of project failure, because the executing team is often unable to detect cost overruns until it is well past the point where it can change the outcome of the project. 3.3.2 Illustrative Explanation Earned Value Project Management (EVPM) is a concept that helps Project Managers seamlessly link Time and Cost for more effective control. Despite the difficult-sounding title and the jargon typically associated with EVPM, the basic idea is very simple and can be used effectively in a wide variety of situations. The best way to understand EVPM is to walk through a sample project, so I am going to take you through a software project. Let's say we are working on an ERMS (enterprise resource management system) that has 10 deliverables/modules, each to be completed in one month with a budget of 10 Lac Rupees. The total project span works out to 10 months at a budgeted cost of Rs. 100 Lacs (Rs. 1 crore). We are at the end of the first three months and the Project Manager is busy preparing his project report. He starts by reviewing progress and finds that two deliverables are fully complete while the third is 80% complete. He checks with accounts and finds that a total of Rs. 28 Lacs has been spent so far.
With this information he is ready to assign values to the three basic variables required to perform EVPM. These are as follows: 3.3.2.1 BCWS / Planned Value (PV) Budget Cost for Work Scheduled, also known as 'Planned Value', is the amount of money that should have been spent at this point in the life of the project if the project were proceeding as per plan. It is the time-phased budget baseline (see figure): the approved budget for accomplishing the activity, work package or project related to the schedule. It can be viewed as the value to be earned as a function of project work accomplished up to a given point in time [12]. In our case we had planned to complete three deliverables in three months, so we should have spent Rs. 30 Lacs. A word of caution here: most projects don't proceed in a linear fashion (i.e. total budget/total duration in months). Correct BCWS values can be obtained from a resource-loaded project plan that takes into account the actual work to be done in each period. 3.3.2.2 Budget at Completion (BAC) This is the total budget baseline for the activity, work package or project. It is the highest value of PV, as shown in Figure 1, i.e. Rs. 100 Lacs. 3.3.2.3 ACWP / Actual Cost (AC) This is the cumulative actual cost spent to a given point in time to accomplish an activity, work package or project [12]. Actual Cost for Work Performed is the amount of money that we have actually spent on the project; accounts have told us that we have spent Rs. 28 Lacs. 3.3.2.4 BCWP / Earned Value (EV) This is the cumulative earned value for the work completed up to a point in time. It represents the amount budgeted for performing the work that was accomplished by a given point in time [12]. To obtain the EV of an item, simply multiply its total budget by its completed proportion. Budget Cost for Work Performed is thus the assessment of the value of the work that we have completed.
Think of this as the worth of the work that we have completed: had we completed three deliverables, we would have completed Rs. 30 Lacs worth of work. But we have only fully completed two deliverables, which gives us Rs. 20 Lacs, plus 80% of the third. Partial completion is a tricky issue, because partial estimates vary from person to person depending on how optimistic or pessimistic they are. There are rules of thumb (heuristics) to deal with this situation. The common ones are 0-100 (give no credit until the task is complete), 20-80 (give 20% credit when the task is underway and the remaining 80% when it is completed), and 50-50 (give 50% credit for starting the task and the balance on completion). The choice of method is up to you, but you must use the same measure across the project for all tasks. In our case, let's say we go with the 50-50 rule, so we'll give Rs. 5 Lacs credit for the third deliverable, which brings the BCWP to Rs. 25 Lacs (20+5). Note that BCWP is also referred to as the Earned Value (EV). Let's start by calculating the two basic measures of performance, SPI and CPI. 3.3.2.5 Schedule Performance Index (SPI) The Schedule Performance Index is an indicator for assessing our performance relative to the plan. SPI = BCWP/BCWS = 25/30 = 0.83. We know we are behind schedule; what SPI tells us is that we have only completed 83% of the work that we originally planned to complete. 3.3.2.6 Cost Performance Index (CPI) The Cost Performance Index shows how much value we are getting for each Rupee that we spend on the project. CPI = BCWP/ACWP = 25/28 = 0.89. We are over budget because, for producing Rs. 25 Lacs of work, we have spent Rs. 28 Lacs; we are only getting 89 Paisas of value for each Rupee spent. Just looking at SPI and CPI, we know that we have a problem: we are both over budget and behind schedule.
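The arithmetic of the worked example can be sketched in a few lines of Python (a minimal illustration; the function and variable names are my own, and the figures, in Lacs, are the ERMS numbers from the text):

```python
# Earned value for the ERMS example: 10 deliverables of Rs. 10 Lacs each,
# reviewed at the end of month 3 with Rs. 28 Lacs actually spent.
BUDGET_PER_DELIVERABLE = 10  # Lacs

def earned_value(completed, in_progress, rule=(0.5, 0.5)):
    """Credit completed deliverables in full; credit in-progress ones
    using a heuristic rule such as 50-50 (half credit once started)."""
    start_credit, _ = rule
    return (completed + start_credit * in_progress) * BUDGET_PER_DELIVERABLE

bcws = 3 * BUDGET_PER_DELIVERABLE                # Planned Value: 3 deliverables due
acwp = 28                                        # Actual Cost from accounts
bcwp = earned_value(completed=2, in_progress=1)  # 20 + 5 = 25 Lacs

spi = bcwp / bcws  # 25/30 -> about 0.83, behind schedule
cpi = bcwp / acwp  # 25/28 -> about 0.89, over budget
print(f"EV={bcwp} SPI={spi:.2f} CPI={cpi:.2f}")
```

Swapping the `rule` tuple for (0, 1) or (0.2, 0.8) reproduces the 0-100 and 20-80 heuristics mentioned above.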
A lot of work has been done on using SPI and CPI early in the project to predict the final outcome. Most of it has been done in the US defense industry, where researchers have examined dozens of completed projects and tried to correlate their outcomes with the status of SPI and CPI early in the project. Most studies show that the values of SPI and CPI when the project is only 20% complete can very accurately predict the final outcome. Using heuristics developed from these studies we can predict the following: Projected Project Duration = Planned Duration / SPI = 10 / 0.83 = 12 months, so we expect the project to be completed two months behind schedule. Projected Project Cost = Planned Cost / CPI = 100 / 0.89 = Rs. 112 Lacs, so we expect a Rs. 12 Lacs overrun on the budget. Recovery Cost: this is the cost we will incur if we need to complete the project within the originally specified time by adding additional resources. Recovery Cost = Planned Cost / (CPI * SPI) = 100 / (0.89 * 0.83) = Rs. 135 Lacs, so we should be ready to exceed the budget by 35% if we want to complete the project on time. Conventional wisdom says that your ability to change the outcome of a project is greatest at the start and smallest near the end, so it makes good sense to detect problems early and take action while you have room to maneuver. If you think about the 20% point intuitively, you'll note that any estimation errors leading to a low CPI (i.e. budget overrun) are likely to affect the remaining activities of the project at the same rate; similarly, the performance of your resources in execution is unlikely to get any better than what they have proven capable of in the first fifth of the project.
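These rules of thumb can be captured in a small helper (a sketch only; the function name and dictionary keys are my own, and the inputs are the ERMS figures, in months and Lacs):

```python
def projections(planned_duration, planned_cost, spi, cpi):
    """Heuristic forecasts from early SPI/CPI readings, per the
    rules of thumb quoted in the text."""
    return {
        "duration": planned_duration / spi,          # schedule forecast
        "cost": planned_cost / cpi,                  # budget forecast
        "recovery_cost": planned_cost / (cpi * spi)  # cost to finish on time
    }

# ERMS example: 10-month, Rs. 100 Lacs plan with SPI 0.83 and CPI 0.89
p = projections(10, 100, spi=0.83, cpi=0.89)
print({k: round(v) for k, v in p.items()})  # about 12 months, 112 and 135 Lacs
```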
Given the importance of early detection, think about conventional project management and how little it can tell you from the fact that you have completed two deliverables and 80% of the third and spent Rs. 28 Lacs. Because of this, problems often evade early detection, and by the time someone notices, it is too late in the project to do much about it; at that stage the project is controlling the project manager instead of vice versa. 3.3.4 Integrating EVM and Risk Management In today's uncertain business environment there is understandable pressure to improve the quality of decision-making at all levels of the organization. A number of techniques have been developed to address this concern, in an attempt to introduce a rational framework to the decision-making process. Two of the leading approaches are Earned Value Management (EVM) and Risk Management (RM). These stand out from other decision-support techniques because both EVM and RM can and should be applied in an integrated way across the organization. Starting at the project level, both EVM and RM offer powerful insights into factors affecting project performance. Another key similarity between the two techniques lies in the word "management". It is possible to conduct "Earned Value Analysis" and "Risk Analysis" to expose underlying drivers of performance, but both techniques emphasize the need to move from analysis to management, using the information to support proactive decision-making. Consequently, both EVM and RM encourage their users to take appropriate management action based on the results, not to stop at mere analysis. Since both EVM and RM address the same problem space (the performance of projects, programs, portfolios and businesses), and both provide management information as a basis for decisions and action, there has been considerable interest in the possibility of developing a combined approach to create synergistic benefits.
Currently EVM and RM operate as parallel, coexisting processes without systematic integration (although good project managers may intuitively link the two in practice). Much of the discussion to date on the relationship between EVM and RM has been rather theoretical, addressing the key principles underlying the two techniques. The objective here is to analyze steps that can be implemented to combine EVM and RM in order to gain maximum benefit for projects and the organization. 3.3.4.1 Weaknesses in EVM and RM The strengths of EVM and RM have already been described, as their proponents seek to encourage wider uptake. Each technique, however, has at least one key weakness which presents a significant danger to those relying on its output to support strategic or tactical decision-making. For EVM, one of the main perceived weaknesses is its reliance on a key assumption: that future performance can be predicted based on past performance. Calculated performance measures (CPI, SPI, CV, SV etc.) are used to predict forwards and estimate the cost at completion or overall duration. Unfortunately there is no guarantee that this basic EVM assumption will hold, and it is likely that the future will deviate from what is predicted by simply extrapolating from past performance. The strength of EVM lies in its rigorous examination of what has already occurred on the project, using quantitative metrics to evaluate past performance. It goes on, however, to predict future performance by extrapolating from the past. But it is not possible to drive a car by only looking in the rear-view mirror: a forward view is also required, and this is what RM offers. While project planning looks at the next steps which lie immediately ahead, RM has a horizon further into the future. It acts as forward-looking radar, scanning the uncertain and unclear future to identify potential dangers to be avoided, as well as seeking possible additional benefits to be captured.
However, this undoubted strength of being resolutely and exclusively future-focused is also one of the key weaknesses of RM. Anything which occurred in the past is of little or no interest to the risk process, since there is no uncertainty associated with past events. RM starts with today's status quo and looks ahead. How the project reached its current position is not relevant to the risk process, unless one is seeking to learn lessons to assist RM on future projects. As a result, RM as commonly implemented often lacks a meaningful context within which to interpret identified risks, since it has no means of capturing past performance and feeding it into the decision-making process. If EVM is weakened by assuming that future performance can be predicted from past performance, and if RM is weakened by looking only forwards with no real awareness of the past, a useful synergy might be obtained if a combined EVM-RM approach were able to address these weaknesses. Combining a rear-view mirror with forward-looking radar would use the strengths of complementary approaches to compensate for the weaknesses inherent in using each alone. Consequently it is possible to produce significant benefits by using RM to provide the forward view required by EVM, and by using EVM to provide the context required by RM. 3.3.4.2 Synergies from a Combined Approach Given the common aims of EVM and RM to examine and expose drivers of project performance in order to focus management attention on the achievement of objectives, and given their differing perspectives towards the past and the future, a number of areas of possible synergy exist between the two techniques. The steps required to implement these synergies are [18]: creating the baseline spend plan; predicting future outcomes; and evaluating risk process effectiveness. 1.
Creating the baseline spend plan The foundation for EVM is the baseline plan of expected spend over time, creating the profile of "Budgeted Cost of Work Scheduled" (BCWS) or "Planned Value" (PV) against which project performance is measured. This baseline is derived from a costed and resourced project plan, including fixed and variable costs arising from financial and human resources. The BCWS profile is typically presented as a cumulative curve, or S-curve, as in the figure below. The baseline BCWS exists as the benchmark against which project performance will be measured. However, one of the first things a project manager learns is that reality never precisely matches the project plan. As soon as work starts, there are variations in productivity, resource and information availability, delivery dates, material costs, scope etc. This is why a rigorous change-control process is vital to successful project management. Although not all changes can be foreseen before the project starts, it is possible to assess the degree of uncertainty in a project plan, in both its time and cost dimensions. This is the domain of RM. One of the first contributions that RM can make to EVM is to make explicit the consideration of uncertainty and risk when constructing the baseline BCWS. By undertaking a full risk assessment of the project plan before the project starts, addressing uncertainties in both time and cost, it is possible to evaluate the degree of risk in the baseline project plan. Quantitative risk analysis techniques are particularly useful for this, especially the use of Monte Carlo simulation on integrated models which include both time and cost uncertainty.
These risk models take account of variability in planned values, also called "estimating uncertainty" (for example by replacing planned single-point estimates of duration or cost with three-point estimates or other distribution types), and they should also model the effect of discrete risks to reflect their assessed probability of occurrence and their subsequent impact on project time and/or cost. Both threats and opportunities should be addressed in the risk model, representing the possibility of exceeding or failing to meet the project plan. The results of the risk analysis allow the best-case project outcome to be determined, representing the cheapest and quickest way to reach project completion. Similarly, a worst-case profile can be produced, with the highest cost and longest duration. All other possible outcomes are also calculated, allowing the "expected outcome" within this range to be identified. These can be shown as a set of three related S-curves, as in the figure below, which take account of both estimating uncertainty (variability in planned events) and discrete risks (both positive opportunities and negative threats). The ellipse at the end of the curves represents all possible calculated project outcomes (at the 90% confidence limit), with the top-right value showing the worst case (highest cost, longest schedule), the bottom-left giving the best case (cheapest and quickest), and the centre of gravity of the ellipse lying at the expected project cost and duration. The existence of this set of possible project outcomes raises the question of where the baseline spend profile for EVM should be set. The recommendation from a combined approach to EVM and RM is to use the expected-value cumulative profile from a quantitative time-cost risk analysis as the baseline for BCWS. In other words, the central S-curve in Figure 2 would be used as the baseline instead of the S-curve in Figure 1.
This ensures that the EVM baseline fully reflects the risk associated with the project plan (including an appropriate amount of contingency, which is automatically incorporated in the risk analysis results), rather than measuring performance against the raw "all-goes-to-plan" plan. 2. Predicting future outcomes Both EVM and RM attempt to predict the final outcome of the project, based on information currently known about it. For EVM this is achieved using calculated performance indices, with a range of formulae in use for calculating the Estimate At Completion (EAC). Most of these formulae start with the Actual Cost of Work Performed to date (ACWP, or Actual Cost, AC) and add the remaining budget adjusted to take account of performance to date (usually using the Cost Performance Index, CPI, or a combined performance efficiency factor based on both CPI and SPI). These calculations of the Estimate To Complete (ETC) are used to extrapolate the ACWP plot for the remainder of the project to estimate where the project might finally end (EAC), as shown in Figure 3 below. RM predicts a range of possible futures by analyzing the combined effect of known risks and unknown uncertainty on the remainder of the project. When an integrated time-cost risk model is used, the result is a set of S-curves similar to Figure 2, but covering the uncompleted portion of the project, as in Figure 4. It is also possible to use risk analysis results to show the effect of specific risks (threats or opportunities) on project performance as measured by earned value. Since the risk analysis includes both estimating uncertainty and discrete risks, the model can be used to perform "what-if" scenario analysis showing the effect of addressing particular risks. 3. Evaluating risk process effectiveness A risk can be defined as "any uncertainty that, if it occurs, would have a positive or negative effect on achievement of one or more project objectives".
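The common family of EAC formulae described above (actuals to date plus the remaining budget scaled by a performance factor) can be sketched as follows; this is an illustrative helper of my own, applied to the ERMS figures, not a prescribed formula from the text:

```python
def eac(acwp, bac, bcwp, cpi, spi=None):
    """Estimate At Completion: actual cost to date plus the remaining
    budget (BAC - BCWP) scaled by a performance factor, either CPI
    alone or a combined CPI*SPI efficiency factor."""
    factor = cpi * spi if spi is not None else cpi
    return acwp + (bac - bcwp) / factor

# ERMS example at the month-3 review (figures in Lacs):
print(round(eac(acwp=28, bac=100, bcwp=25, cpi=25/28)))             # 112
print(round(eac(acwp=28, bac=100, bcwp=25, cpi=25/28, spi=25/30)))  # 129
```

Note that the CPI-only forecast of Rs. 112 Lacs agrees with the Planned Cost / CPI heuristic used earlier; the combined CPI*SPI factor is more pessimistic because it assumes schedule slippage will also persist.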
RM aims to address this uncertainty proactively in order to ensure that project objectives are achieved, including completing on time and within budget. As a result, if RM is fully effective, actual project performance should closely match the plan. Since the EVM performance indices (CPI, SPI) measure deviation from plan, they can be used to indicate whether the risk process is effective in addressing uncertainty and controlling its effects on project performance. If CPI and/or SPI are below 1.0, indicating that project performance is falling short of the plan, then one of the most likely underlying causes is that the risk process is failing to keep the project on course. An ineffective risk process fails to avoid adverse risks (threats) proactively, and when threats materialize into problems the project incurs delay and/or additional cost. Either the risk process is not identifying the threats, or it is not preventing them from occurring. In this situation, management attention should be directed to the risk process, to review its effectiveness and consider whether additional resources are required, or whether different techniques should be used. Conversely, if CPI and/or SPI are above 1.0, indicating that project performance is ahead of plan, the risk process should be focused on exploiting the opportunities created by this situation. Best-practice RM addresses both threats and opportunities, seeking to minimize threats and maximize opportunities. When EVM indicates that opportunities exist, the risk process should explore options to capture them and create additional benefits for the project. It should also be noted that if CPI and/or SPI far exceed 1.0, this may indicate other problems in the project and may not simply be due to the existence of opportunities. Typically, if actual performance is much better than expected or planned, this could indicate poor planning or incorrect scoping when the initial baseline plan was set up.
If this highly anomalous behavior continues, a baseline re-planning effort should be considered, which of course will involve further risk management. Similarly, if CPI and/or SPI are well below 1.0, this may not simply be due to the impact of unmanaged threats, but may indicate problems with the baseline plan or scope. Figure 5 illustrates the relationship between the values of the EVM indices (CPI and/or SPI) and RM process effectiveness. The key to using EVM indices as indicators of RM effectiveness is to determine appropriate thresholds at which action is required to refocus the risk process. Clearly some variation of the EVM indices is to be expected as the project unfolds, and it would not be wise to modify the risk process in response to every small change in CPI and/or SPI. However, if a trend develops that crosses the thresholds of "common variance", action should be considered. Figure 6 illustrates this, with the thresholds of "common variance" for CPI and/or SPI set at 0.9 and 1.25. A further "warning threshold" is set at 0.75, suggesting that an adverse trend is developing and preparatory steps should be taken. The thresholds of 0.75, 0.9 and 1.25 used in Figure 6 are illustrative only, and organizations may be able to determine more appropriate threshold values by reviewing historical trend data for CPI and SPI and identifying the limits of "common variance" for their projects. Plotting the trend of CPI and SPI over time against such thresholds also gives useful information on the type of risk exposure faced by the project at any given point. For example, Figure 6 indicates that the project schedule is under pressure (the SPI trend is consistently below 1.0), suggesting that the risk process should focus on addressing sources of time risk. The figure also suggests that cost savings are possible, which might create opportunities that can be exploited, and the risk process might be able to maximize these.
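The threshold scheme just described can be sketched as a simple classifier (my own illustrative function, using the text's example thresholds of 0.75, 0.9 and 1.25, which real organizations should calibrate from historical data):

```python
def risk_process_signal(index, warn=0.75, low=0.90, high=1.25):
    """Map a CPI or SPI trend value to a suggested risk-process action,
    using the illustrative 'common variance' thresholds from the text."""
    if index < warn:
        return "urgent: refocus risk process on threats; review the baseline"
    if index < low:
        return "warning: adverse trend developing, prepare threat responses"
    if index <= high:
        return "common variance: no change to the risk process"
    return "exploit opportunities; check scope and planning assumptions"

print(risk_process_signal(0.83))  # the ERMS SPI falls in the warning band
```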
These recommended action types are illustrated in Figure 7, corresponding to the following four situations: both CPI and SPI high (top-right quadrant), creating opportunities to be captured; both CPI and SPI low (bottom-left quadrant), requiring aggressive action to address threats; high SPI but low CPI (top-left quadrant), requiring focused attention to cost risk, with the possibility of spending additional time to address it; and high CPI but low SPI (bottom-right quadrant), where attention should be paid to addressing schedule risk, and cost trade-offs can be considered. Figure 7 also suggests that if either CPI or SPI (or both) remain abnormally high or low, the baseline plan should be re-examined to determine whether the initial scope was correct or whether underlying planning assumptions were unfounded. It is important to note that these action types should be viewed only as first options, since other considerations may lead to different actions. For example, in projects with hard schedule constraints (e.g. product launches, event management etc.), the trade-off between time and cost may be prioritized differently than in cost-constrained projects. 3.3.4.3 Discussion Both Earned Value Management (EVM) and Risk Management (RM) seek to improve decision-making by providing a rational framework based on project performance. EVM examines past performance against clearly defined quantitative metrics, and uses these to predict the future outcome of the project. RM looks ahead to identify and assess uncertainties with the potential to affect project performance either positively or negatively, and develops responses to address each risk proactively. Both techniques share a focus on project performance, and have the same purpose of developing effective actions to correct unwelcome trends in order to maximize the likelihood of achieving project objectives. One (EVM) does this by looking back at past performance as an indicator of likely future performance.
The other (RM) looks ahead at possible influences on future project outcomes. These two approaches are not in conflict or mutually exclusive. Indeed, their commonalities imply a powerful synergy, which is available through combining the complementary strengths of each technique and using insights from one to inform the application of the other (as summarized in Table 5). 1. Creating the baseline spend plan (BCWS/PV): Develop a costed WBS to describe the scope of work, without hidden contingency. Produce a fully costed and resourced project schedule. Assess the estimating uncertainty associated with initial time/cost estimates. Perform risk identification, risk assessment and response development. Quantify the time and cost risk exposure for each risk, taking account of the effect of agreed responses. Create an integrated time/cost risk model from the project schedule, reflecting both estimating uncertainty (via 3-point estimates) and discrete risks. Select a risk-based profile as the baseline spend profile (BCWS/PV); it is most common to use the "expected values", although some other confidence level may be selected (say 80%). 2. Predicting future outcomes (EAC): Record project progress and the actual cost spent to date (ACWP), and calculate the earned value (BCWP). Review initial time/cost estimates for activities not yet completed, to identify changes, including revised estimating uncertainty. Update risk identification, assessment and quantification, to identify new risks and reassess existing ones. Update the integrated time/cost risk model with revised values for estimating uncertainty and discrete risks, taking account of progress to date and agreed risk responses. Select the risk-based calculation as the estimate of final project duration and cost (EAC), using either the "expected values" or some other confidence level (say 80%). Use the risk-based profile as the updated expected spend from time-now to project completion. 3.
Evaluating risk management process effectiveness: Determine threshold values for CPI and SPI to trigger corrective action in the risk process (or use default values of 0.75, 0.90 and 1.25). Calculate the earned value performance indices (CPI and SPI), plot their trends and compare them with the thresholds. Consider modifications to the risk process if CPI and/or SPI cross the thresholds, enhancing the process to tackle opportunities more effectively if CPI and/or SPI are high, or refocusing the process on threat reduction if they are low. Take appropriate action either to exploit opportunities (high CPI/SPI), address threats (low CPI/SPI), spend contingency to recover time (high CPI/low SPI), or spend time to reduce cost drivers (high SPI/low CPI). Consider the need to review the initial baseline, project plan or scope if CPI and/or SPI persistently take unusually high or low values. Table-5: Summary of steps to integrate EVM and RM
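The four Figure 7 quadrant actions referenced in the steps above can be expressed as a small dispatcher (an illustrative sketch of my own; the action strings paraphrase the text, and other constraints such as hard deadlines may override these first options):

```python
def quadrant_action(cpi, spi, threshold=1.0):
    """Return the first-option action for a CPI/SPI reading, following
    the four-quadrant scheme described for Figure 7."""
    if cpi >= threshold and spi >= threshold:
        return "capture opportunities"
    if cpi < threshold and spi < threshold:
        return "aggressive action against threats"
    if spi >= threshold:  # high SPI, low CPI
        return "focus on cost risk; consider spending time to cut cost"
    return "focus on schedule risk; consider cost trade-offs"

print(quadrant_action(cpi=0.89, spi=0.83))  # ERMS: both indices low
```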
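The earned value indices used in step 3 can be sketched in a few lines of Python. The figures below are hypothetical, and EAC here uses the common BAC/CPI shortcut (the risk-based EAC described in step 2 would instead come from the updated risk model):

```python
# Minimal sketch of the earned value indices described above.
# The input figures are illustrative, not data from the text.

def evm_indices(bcws, bcwp, acwp):
    """Return (CPI, SPI) from planned value (BCWS/PV),
    earned value (BCWP/EV) and actual cost (ACWP/AC)."""
    cpi = bcwp / acwp   # cost performance index
    spi = bcwp / bcws   # schedule performance index
    return cpi, spi

def estimate_at_completion(bac, cpi):
    """Simple EAC, assuming current cost performance continues."""
    return bac / cpi

# Example project status (hypothetical numbers):
bcws, bcwp, acwp, bac = 100_000, 90_000, 120_000, 500_000
cpi, spi = evm_indices(bcws, bcwp, acwp)
eac = estimate_at_completion(bac, cpi)
print(f"CPI={cpi:.2f}, SPI={spi:.2f}, EAC={eac:,.0f}")

# Flag a review of the risk process when an index crosses the
# default threshold band mentioned in the text (0.90 to 1.25).
for name, value in (("CPI", cpi), ("SPI", spi)):
    if value < 0.90 or value > 1.25:
        print(f"{name} outside threshold band: review risk process")
```

With these inputs the project is over cost (CPI below 0.90), so the final line flags the cost index for corrective action in the risk process.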

Tuesday, August 20, 2019

How Important Is Race In US Politics?

"I am the son of a Black man from Kenya and a White woman from Kansas," asserted Barack Obama in 2008, "and for as long as I live, I will never forget that in no other country on Earth is my story possible." This speech came in March 2008; until this point, candidates on both sides had avoided discussing race as an issue. Obama wished to establish himself as a candidate outside of race, yet ultimately this was not possible. Those opposed to this strategy ensured that race remained an integral factor in the 2008 election and the wider US political scene. Race can be seen to link to a variety of policy areas. For example, a recent New York Times article states that "four in 10 Black children are born into poverty [while] less than one in 10 White children are". [2] Statistics such as this demonstrate that race, in connection with economics and class, is a central issue for US politics more generally. The Centers for Disease Control and Prevention compiled an extensive report in January 2011 detailing racial disparities in a broad array of health problems, highlighting the continuing significance of race as a policy issue, particularly in popular discourse, as this report received much mainstream media attention. [3] Yet the subsequent issues raised by race have changed. No longer are blatant displays of racism socially or legally acceptable; so what is preventing us from deeming America a post-racial society? [4] Furthermore, why is a post-racial society the aspiration? On the one hand, it falls in line with the American principle of a united nation; on the other, it is considered dangerous to attempt to embrace different cultures after so many years of segregation. Furthermore, the absence of overt discrimination does not mean that exclusion has ended, but rather that the character of [such] discrimination has changed.
[5] This paper shall argue the continuing importance of race in US politics, both through its overt influence on policy making and through its implicit influence, as discussions which avoid race often make an equally important statement. This essay shall consider race largely in terms of the traditional binaries of Black and White. Incorporating an analysis of ethnicity would be too broad, particularly as language and immigration would need to be considered. Obama's election as a mixed-race American has brought the traditional binaries of Black and White back to the forefront of discussions. While other minority groups do add another layer of complexity, analysis of this goes beyond the scope of this essay. A further constraint has meant that race will be discussed with regard to the domestic and, in particular, the federal level. Historically, race issues differed between states, and while there may still be variation in perceptions between more conservative and more liberal states, a discussion of federal policy regarding race will allow wider conclusions to be drawn. Race can be considered to encompass issues of governmental policy, party policy, public perceptions and political strategy. If race is viewed in this way, it is possible to attempt to separate political and legislative conceptions of race from discussions of individual discrimination. The former is the focus of this paper. All of these factors become heightened during election years, when race continues to divide people, even within the same party. Notably, the emergence of the Tea Party faction on the Republican side, a platform for conservative populist discontent, demonstrates that the views held are not true of all of the Republican Party; furthermore, it may not necessarily represent the views of all Tea Party movements, as there is no single Tea Party. [6] It is the nature of US political parties to encompass huge variation within the main parties.
Election years provide an increased awareness of the political, and as such will provide recent examples of trends in racial politics. This can be seen presently through the debate over Obama's place of birth, with the administration choosing to release the long-form birth certificate before the next election cycle. The argument will proceed through three substantive sections: first, a discussion of race and the electorate; second, the factors which continue to shape racial inequality; and finally governing, including an analysis of candidate choice.

Race and class

Recent shifts in the American demographic are crucial to understanding how race as an issue has changed, particularly in the post-Bush era. This change in administration provided substantive change in some areas, but arguably not during the 2008 campaign period. Minorities did, and will, continue to be a secondary concern while White voters retain plurality status; this notion featured prominently in the 2008 general election, as voting statistics suggest parties will continue to bring White issues to the forefront in order to win elections. It can be seen that there is a glaring ideological disconnect between the desire for and the reality of a race-free society. [7] Teasley and Ikard, in their article "The Myth of Postracism", suggest the danger of complete investment in postracial thinking, particularly for the most economically vulnerable African American population. The prominent liberal view of racial policy suggests a cautious approach, favouring the idea of a colour-blind society. While it is suggested that there is no currently viable alternative to a liberal vision of race, it can be inferred that racialism as a theory at least acknowledges the persistence of racism in America. [8] Critical Race Theory (CRT) favours a race-conscious approach, reliant upon political organisation. In arguing the need for CRT, Metzler presents an argument for why the term postracial is meaningless as a critique.
[9] Usefully, the theory also allows for intersections between race, class and sex. A basic premise is that while electoral decisions may claim to be colour blind, they are actually steeped in racist ideology. [10] For example, it can be argued that race as a political factor will be avoided as much as possible, unless a politician's political survival depends on it (as in the 2008 campaign, or the Sotomayor nomination). The premise of a race-neutral campaign is to develop a coalition of support regardless of race. Yet ultimately there remains a divergence as to why different racial groups voted for Obama: while many Whites voted for Obama as a way to move beyond race, many Blacks voted for him as a way to vindicate the entire Black race. [11] While the term vindicate may be unnecessarily emotive, the notion of collective Black support for Obama is significant as an example of the continuing relevance of race in US electoral politics and of the differing motives for voting behaviour. The mere suggestion that issues still exist as Black or White demonstrates the continuing importance of race. After his 2000 election victory, Bush was famously advised that if he did not improve his minority vote, he would be unable to win the next election. The pattern of immigration in the US has left the country with a large multicultural demographic. The nature of such immigration, being both forced through slavery and voluntary, is a relatively unique phenomenon; as such, racial issues are historically rooted in much of American society. This seems to make some of the electorate, particularly minorities, more inclined to talk about race, while often having the opposite effect among White voters. Due to the growth in minority populations, there has been a proportional decrease in the White population. In 2008, the Black population alone comprised 12.8% of the population.
The national census of 2010 puts this figure at 12.6%, demonstrating a further demographic shift, with African Americans no longer comprising the majority minority, as the Black population now comprises a smaller proportion than other minority groups. [12] This suggests a limit to the traditional oppositional binaries of Black vs. White, with new minorities gaining ever-increasing populations and, in theory, increasing significance. Binaries remain important, but it is important to realise that they do not always give us a complete picture, as an increasing number of citizens describe themselves as multi-racial: 3.4% in the last census. [13] However, in the 2008 election, focus was not given evenly to each minority group, perhaps because some minority groups are more valuable when translated into votes, or perhaps because some groups are more politically active than others. South Asian voters had a huge impact on the Democratic primaries in 2008, particularly in California, yet the binary view continued to dominate discussions. This may have been a temporary fluctuation, encouraged by the race between a Black candidate and a White candidate for the presidency. Yet it seems that the trend is actually a continuation of a longstanding history of dealing in terms of Black and White issues of race. The Black community also remains much more vocal than other minority groups, particularly more assimilated Asian voters. Ultimately, binaries remain useful in demonstrating the importance of race in contemporary politics, as it remains that conflicts between Black issues and White issues are at the forefront of debate, particularly with regard to healthcare and education. In the 1990s, Bill Clinton restored the Democratic Party's competitiveness by mostly avoiding the race-specific rhetoric and policies that had helped drive disaffected White voters toward the Republican opposition.
The boom he presided over produced political and economic benefits for African-American families as well as Whites, making him popular with both groups. Comparing this to the situation in 2008, a deeper analysis of Obama's poll numbers […] indicates that very little changed in terms of voting habits in this election cycle. [14] Thus, on the surface it seems significant that Obama successfully reproduced the polling numbers of Clinton, a White Southerner, adding substance to the post-racial argument. [15] Yet in context, Obama was nominated during a period of frustration with the outgoing Republican administration, at a time of economic uncertainty, and when a generation of African Americans had won elected office. Thus it can be seen how race as a domestic issue is closely interlinked with other factors, notably feeling toward the outgoing administration and the fluctuating state of the economy. As such, examples which are often cited to demonstrate the decreasing significance of race can actually be at least partially attributed to other factors. The socioeconomic divide in America is expanding; in a multiracial society where the races are unequal, there will often be a racial dimension to class differences, for class is an efficient recoder of racism. [16] Reed suggests that this is an historical trend, built on the back of enslaved Africans; as such, for Reed, race and colour have always been the ultimate determinants of socioeconomic status. [17] Yet it seems that the greatest divide came long after the period of Reconstruction. Since 1970 the socioeconomic divide has become more evident; while the status of the most disadvantaged members of the minority population has deteriorated, that of the advantaged has notably improved. [18] This is clearest in relation to the Black American population. As such, it seems that race becomes less important because of socioeconomic factors.
The rate of improvement is also notable; in several areas, such as college attendance, Blacks […] have made those improvements at a relatively faster rate than the reported progress of comparable Whites. [19] Thus, the fact that the number of Blacks enrolled full-time at colleges and universities nearly doubled between 1970 and 1980 (to over 1 million) demonstrates that there is a growing economic schism between lower-income and higher-income Black families, with the lower members of the community being left behind. [20] Policies such as affirmative action enhance this trend, doing more for the more advantaged members of Black communities than for those on lower incomes. With race being so closely connected to socioeconomic conditions, its importance as a factor in US politics only increases as the subject becomes broader. Hooks divides the Black community into class groups, and suggests that this has a considerable impact on perceptions by both the Black and the White community. [21] The impact of this has been to divide the Black community into subgroups, with many of the higher-income families becoming increasingly assimilated with the White community. As a basic concept this is still relevant, yet much of what Hooks outlined has become dated. Hooks argued that class-based racial integration disrupted what he terms "racial solidarity"; in essence, that previously class standing was irrelevant to the Black community, but increased integration has eroded this bond between communities. [22] While it can be accepted that there did exist a sense of community, it is not true to say this has diminished to the extent which Hooks believed. Significantly, it seems the nomination of Obama reignited a sense of Black community; the mobilisation of Black voters can be attributed to a growing sense of group consciousness and empowerment. [23] Yet what is more convincing is Hooks' argument regarding communities.
The emergence of what has been termed a Black middle class has led to wealth being removed from communities, leaving the poor and underclass as isolated, segregated communities. [24]

Race and the media

It is important to consider whether race can continue to be discussed independently, or whether class is now the more important issue. It seems the two issues are, and have been, fundamentally intertwined, due to long-standing inequalities dating back to before the Civil War. However, the extent of this has changed, and the emergence of a Black middle class has moved class toward the forefront of political discussions of race. It is significant to discuss how and why the public forms conceptions of race, with particular emphasis on the role of the media as a source. As a nation, America emerged from a unique system of oppression and slavery. As such, race remains deeply rooted in the lives of many Americans. In an age where post-racial politics seems to be a common aspiration, for electoral benefit as much as for issues of equality, it is important to realise how race continues to appear on the political agenda. Some significant events can be cited in contemporary American politics as periods of change. The terrorist attacks of September 2001 permanently altered American domestic and foreign policy, and new issues of race were raised with the growing politicisation of Islamophobia. More recently, with regard to the binaries discussed so far, came the political impact of Hurricane Katrina in 2005. Worst affected by this disaster was the city of New Orleans, which had a substantial Black population. The suffering of the people of New Orleans allowed the Democratic Party to establish itself as an alternative, and to distinguish itself through the racial politics of Hurricane Katrina. It gave the Democrats the chance to put race onto the political agenda, yet arguably Obama attempted to distance himself from this strategy.
This task was made easier for the Democrats due to a period of highly publicised racial shaming. By the time of the 2008 election, the Democrats were seen as a viable alternative to the Republican Party, who were famously said not to "care about Black people". [25] This quote from an influential Black performer became a popular sound bite, demonstrating the importance of the media, and it did much to contribute to Bush's unpopularity. Following Hurricane Katrina, many people sought to answer the question of whether its social effects, and the government response to the country's biggest natural disaster, had more to do with race or with class, or whether, again, the two were unavoidably linked. An argument surrounds the prominence race received as a factor in the Katrina disaster. While liberals could be accused of citing race in an attempt to reference a wider, more historic discrimination against Blacks, it was not an effective strategy, as it did little to alter government policy. Therefore, although addressing Katrina as a race issue had a profound effect on the electorate, it was only later that it began to really influence policy. However, it seems that concluding class to be the more significant factor is to divert attention away from race, thus [discouraging] a deeper discussion about the ways race and class intertwine. [26] Ultimately, Katrina is a prime example of the intrinsic way race and class are intertwined, largely due to the historical nature of racism in the region in relation to housing and neighbourhood distribution. Few comparisons were made between White and Black residents, but as Lavelle and Feagin point out, only 17% of Whites lacked access to a car to evacuate with, compared to 60% of Black residents.
[27] Media images showed that nearly all those left suffering in New Orleans were Black Americans, making it seem like a race issue; however, those in more financially stable positions were able to live in safer areas. Those families most able to afford homes in safer, flood-protected areas, and with the resources to evacuate easily, suffered much less than poorer families, seemingly suggesting a class issue. Furthermore, what is also significant about the Katrina example is the way in which the media reported the story. The media are one of the most effective methods of communication across the US; as such, what is reported is highly influential among the electorate. It is universally accepted that mass media hold great power, as they transmit information to the public and are free to highlight certain news items and ignore others, setting the agenda of public life and creating consensus or disagreement on certain issues. [28] However, it took until September, a month after the disaster, for the media story to shift from stories of Black crime to the failures of government in mediating the disaster. Representation of race in the media has often contained rigid stereotypes, particularly with regard to the traditional binaries. This becomes increasingly problematic when it is considered that the portrayal may constitute the only contact a person has with a particular racial group. In a study, Johnson highlighted this dilemma, questioning the consequences: "If somebody is living in Boston, and all their information on Black Bostonians comes from the media, what does that look like?" [29] Johnson purported that White-owned media in Boston tended to report more according to stereotypes than the Black-owned media, while it was the Black-owned agencies that were said to carry more positive stories alongside the negative. Conducting a follow-up to this initial 1980s study, Johnson looked at the distribution of coverage at the turn of the century.
While crime stories continue to top the kind of coverage given to African Americans, the percentage of this coverage has dropped. Among Black-owned media, education stories became central, with crime stories being placed much lower. Perceptions are crucial in politics, particularly in a nation as vast as America. Kellstedt suggests that there is a lack of substantive evidence supporting the notion that media coverage of race actually affects public opinion in any systematic way, yet he goes on to assert that it is an underlying assumption that the media have helped shape the course of race politics. [30] Due to their communicative role, the way the media choose to relay stories, or even their choice of which stories to portray, has a profound impact on the electorate; there is a discourse of racism that advances the interests of Whites and that has an identifiable repertoire of words, images, and practices through which racial power is applied. [31] However, although the media still dominate communication, candidates are developing an increasingly close relationship with the electorate through mediums such as social networking. As such, it seems candidates are gaining increasing access to the electorate; racial issues can thus be dealt with or avoided as the candidates choose. Of course, this is relative, and the media will always retain the power of scrutiny, as is the nature of a liberal democracy. If the media shape the political agenda in the aforementioned way, then what constitutes a racial issue? It seems any number of issues could. For example, with regard to education, the percentage rates of high school graduates can be used to conclude that Black students are still failing at an alarming rate compared with White students. [32] Or with regard to housing: although overt discrimination is no longer practised, other practices still take place to isolate minorities from the housing market.
Issues regarding joblessness, healthcare and criminal justice all continue to disproportionately affect people of colour. But what is important to question is whether these issues should be framed as racial issues, or whether such framing is in itself an acknowledgement of a continuing inequality. If the nation were to truly adopt colour-blind policies, then the theory would suggest that issues should be discussed independently of race. As such, a policy about the environment should be considered in isolation, even though it may disproportionately impact communities of colour. But if these issues are no longer treated in isolation, politicians can be accused of pandering towards affirmative action, which is still viewed sceptically by many of the electorate. To acknowledge that so many political issues can become issues of race is to acknowledge the uniqueness of race as an issue area. Whether the trade-off involved in isolating policies is acceptable, or desirable, gets to the heart of racial policy in the USA.

Is race neutrality possible?

Finally, it will be useful to look at the last presidential campaign, in order to discuss whether it can be concluded that the campaign was race-neutral, and why this may have been an aspiration for so many candidates. [33] While this may have been the intention, partisan strategy, among other factors, ensured that race was not allowed to remain off the political agenda. This continued to be true in light of the 2010 midterm elections: even though Obama was not on the ticket, much discussion in the media was once again given to his African American status and the impact this would have. Race can be seen to affect politics both implicitly and explicitly. Candidate choice was undoubtedly the aspect of race the media occupied themselves with most. The beginning of the campaign was in line with the notion of an inclusive America.
Yet the campaign shifted with the widespread circulation of Reverend Wright's sermon, in which he controversially said that the government lied about weapons of mass destruction in Iraq being a threat to the United States' peace. [34] Obama's candidacy became very clearly race-bound. At this point Obama had to justify himself in racial terms, a strategy which had been avoided until then. The significance of what has since been termed Obama's race speech of 2008 is not just that it was the first point in the campaign at which race became openly discussed, but rather what Obama did in this speech: he acknowledged the continuing tensions, a part of our union we have yet to perfect. [35] He demonstrated that he was a clear personification of both oppositional binaries, while offering an ability to transcend them. Black support can be seen as both a strength and a weakness for Democrats; it can alienate other groups of voters, and traditionally there have been tensions between the Black and Hispanic communities. [36] De-racialisation is seen during campaigning when candidates attempt to avoid explicit references to race issues, in an attempt to remain inclusive. Concurrently, candidates use implicit strategies, such as placing racially symbolic Black and Latino faces in their literature, while putting increased emphasis on issues which are perceived to be racially transcendent, ultimately attempting to appeal to a broad selection of the electorate. [37] Thus even when race is not vocalised during an election, it still plays a vital role. Charles Hamilton first proposed a race-neutral strategy in 1973; [38] Obama has been said to follow such a strategy, in that he did so much to avoid discussing race as an aspect of his campaign. [39] The supposed advantage of such avoidance is to encompass the widest possible selection of the electorate. With regard to voting behaviour, issues need to be directed at those who will provide the swing vote.
There are limitations on the political power and influence of minorities, making it rational for parties to focus on the White majority and to use deracialisation strategies. [40] For example, Democrats traditionally receive a disproportionate share of the minority vote; as such, it is in their interest to direct policy toward White issues, because they can rely on receiving Black votes regardless. Thus, the United States has racially polarised politics while race itself is depoliticised. [41] The running of a race-neutral campaign is in itself an acknowledgement of the importance of race. If one accepts that Black and White voters continue to prioritise different issue areas, it must also be acknowledged that a race-neutral campaign can be difficult to balance: the aim is to attract White voters without losing the connection to the Black community. Race will continue to be an issue even after the election period ends. With regard to Obama, it is again a new phenomenon; if it is assumed that the race-neutral campaign will extend into an attempt at race-neutral governing, then Obama will continue to avoid the issue of race. This has been seen through the first half of his first term: race issues are not overtly mentioned unless completely necessary. However, this is not simply due to Obama's skilled pragmatism. In fact, it seems Obama may be constrained by those who elected him in the first place, as well as by the partisan tactics of the Republican opposition. If Obama had mounted a concerted series of racial policy initiatives, White voters might have felt alienated, and many of the early fears from the campaign would have been perceived to be correct: for example, that Obama, as an African American, was interested in prioritising minority issues. This goes against the intended pluralistic nature of US politics. Reed claims that Americans will have to mount a concerted effort to have Obama promote anything regarded as a Black issue.
[42] Thus, have African American issues actually been side-lined, and consequently jeopardised, through the election of the country's first Black president? If Obama does continue to downplay racial issues, conservative arguments declaring the irrelevance of race will be strengthened. [43] Conservatives use Obama's image as a sign that racism is dead, while at the same time evoking race strategies against him. [44] Race thus remains an unavoidable issue of contention.

Toyota: Sustainable Strategies And Global Success

The global auto industry is a key sector of the economy for every major country in the world. Huge investment in research, development and production results in high industry performance (OICA, 2010). The key players competing in the automotive industry include BMW, Fiat, Ford, General Motors, Honda, Mitsubishi, Nissan, Peugeot Citroën, Toyota, Renault, Volkswagen, Hyundai and Daimler, among others (Datamonitor, 2010). Toyota Motor Corporation has become one of the most successful companies in the world today. In 2010, Toyota was ranked number 5 among the world's largest corporations (Fortune, 2010) and number 11 among the best global brands (Interbrand, 2010). It is also considered the most profitable organisation in the automobile sector (Datamonitor, 2010). For almost 15 years, J.D. Power and other research firms have consistently rated Toyota and its luxury line, Lexus, among the top automotive brands. Over 50 years of automotive operations worldwide, and the launch of the world's first commercial hybrid car, the Prius, have strengthened Toyota and given it a high competitive advantage over rivals in terms of reputation and reliability, initial quality, and long-term durability (J.D. Power, 2010; Stewart and Raman, 2007).
Thus, a study of Toyota's international business operations can provide the researcher with distinctive knowledge of strategy implementation drawn from well-known literature, and an investigation of the relevant facts and information can enhance analytical skills. This report will therefore start with an analysis of the automobile industry, including a market overview, market size and the competitive situation, followed by a review of Toyota Motor Corporation. Its strategic management will then be examined, after which Toyota's performance will be assessed. A strategic analysis will be provided in order to obtain a deeper analysis of the corporation. Finally, the conclusion will summarise the findings of the study.

1. Automobile Industry

1.1 Market Overview

Due to progressive globalisation, climate change and the goal of improving air quality, automobile manufacturers have put more effort into sustainable development, which means minimising fuel consumption and exhaust emissions (UNEP, 2002). They have also made use of advanced technological solutions to reduce waste and emissions in their factories, as well as to improve vehicle safety and recycling (Oliver Wyman, 2010; UNEP, 2002). Moreover, it is crucial for the industry to continue the pursuit of great product design and innovative development, such as the initiative in hybrid technology (Sturgeon, 2009; Oliver Wyman, 2010). In addition, Sturgeon et al. (2009) explained four characteristics of the automobile industry. First, a small number of huge firms have more power than smaller companies: eleven huge and dominant companies come from three countries, Japan, Germany and the USA. Second, the automobile industry has developed a strong regional structure alongside globalising integration. Third, final vehicle assembly has been moved close to the home market due to political sensitivities.
Fourth, only a few generic parts and component systems can be fitted to all products; parts must therefore be customised before production. The condition of the automobile market has also been driven by globalisation, which can be divided into four categories of drivers: market globalisation drivers, cost globalisation drivers, government globalisation drivers, and competitive globalisation drivers (Yip, 1992). Firms should understand and recognise these drivers in order to evaluate and gain competitive advantage. An analysis of the globalisation drivers for the automotive industry is demonstrated in Figure 1.

Figure 1: Yip's globalisation drivers of the automobile industry. Source: Author's own

1.2 Market Size

The global market for new cars recovered from a decline in value in 2008. In 2009, the market rose by 2.7% and reached a value of $1,019.2 billion, while its compound annual growth rate was just 2.1% during 2005-2009. According to the pie chart shown in Figure 2, Toyota Motor Corporation took a 15.3% share of the market's volume, making it the market leader, compared to Volkswagen (14.2%) and Ford (8.1%) (Datamonitor, 2010).

Figure 2: Market size and share of the automobile market. Source: Adapted from Datamonitor (2010)

1.3 Porter's Five Forces

Porter's five forces framework (Porter, 1980) is a crucial tool for analysing the forces that determine the competitive intensity of the automobile industry, as illustrated in Figure 3. It helps in understanding both the strength of the current competitive position and the strength of a position the firm is considering moving into.

Figure 3: Porter's five forces. Source: Author's own

Threat of New Entrants

It can be said that there is a low threat of new entrants in the automobile industry, since it has reached the mature stage of the product life cycle. A new company wanting to enter the market needs to achieve economies of scale, cost reduction and mass production.
Consequently, a newcomer needs a huge amount of capital to own its automotive manufacturing facilities and innovative technology. Moreover, it is quite difficult for a new player to establish its own distribution channels and dealers, given the strong channels and reputations of the key companies.

Bargaining Power of Suppliers

A vehicle consists of many components that are brought together in final assembly, so a large number of suppliers are involved in the production process. These suppliers are very similar because the raw materials do not differ greatly; as a result, it is easy to switch suppliers, and their bargaining power is low.

Bargaining Power of Buyers

Consumers are the main players in this industry, since the automotive business depends on them. It is not difficult for them to switch brands if they are not satisfied. However, to purchase a new car they must deal with a dealer. Overall, buyers in the automotive industry have moderately high bargaining power.

Threat of Substitute Products

Public transport, walking, cycling, and so on can substitute for automobiles, and geographic location has a considerable effect on consumers' purchasing decisions: people in Venice, for example, travel only by boat. For most journeys, however, the automobile remains the more convenient option. Consequently, the threat of substitute products is moderately low.

Intensity of Rivalry among Competitors

Rivalry among competitors is very intense owing to a lack of product differentiation. The key players in the industry are fairly evenly balanced, so one can easily take market share from another. As a consequence, when promoting a product a company has to weigh all aspects, including quality, price, and durability, against its competitors.
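The growth and share figures quoted in section 1.2 can be cross-checked with a short calculation. The sketch below is illustrative only; it uses the rounded Datamonitor (2010) figures from the text, so the derived 2005 and 2008 values are approximations rather than reported numbers.

```python
# Figures from section 1.2 (Datamonitor, 2010); simple CAGR and
# market-share arithmetic to show how the reported numbers relate.

value_2009 = 1019.2      # global new-car market value in 2009, $bn
growth_2009 = 0.027      # 2.7% rise during 2009
cagr_2005_2009 = 0.021   # compound annual growth rate, 2005-2009

# Implied 2008 value, before the 2.7% recovery
value_2008 = value_2009 / (1 + growth_2009)

# Implied 2005 value from the 2.1% CAGR over four annual steps
value_2005 = value_2009 / (1 + cagr_2005_2009) ** 4

# Leaders' shares of market volume (Figure 2), in percent
shares = {"Toyota": 15.3, "Volkswagen": 14.2, "Ford": 8.1}
leader = max(shares, key=shares.get)

print(f"implied 2008 value: ${value_2008:.1f}bn")
print(f"implied 2005 value: ${value_2005:.1f}bn")
print(f"volume leader: {leader}")
```

Run as shown, the sketch recovers a 2008 market value just under the 2009 figure and confirms Toyota as the volume leader, consistent with the text.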
1.4 Strategic Group Analysis

Porter (1980) defined a strategic group as a group of companies in an industry that implement similar strategies. As Figure 4 shows, the mass market follows a cost-leadership strategy, while the luxury group implements a differentiation strategy (Peng, 2009); the ultra-luxury group generally pursues a focus strategy (Peng, 2009). Within Toyota Motor Corporation, the Toyota and Scion brands follow cost leadership to compete in the mass market, whereas the Lexus brand serves the luxury market (Toyota, 2010).

Figure 4: Strategic group analysis. Source: Adapted from Peng (2008) and Henry (2008)

Furthermore, Oliver Wyman (2010) suggested that the automobile industry can be split into two types of company: mega groups and independent champions. Mega groups own their manufacturing facilities, technologies, platforms, and engines, whereas independent champions depend on channel resources and their networks. By these criteria Toyota is a mega group, as illustrated by the successful OEM (Original Equipment Manufacturer) paradigm shown in Figure 5.

Figure 5: Successful OEM diagram. Source: Adapted from Oliver Wyman (2010)

2. Toyota Motor Corporation

2.1 Company Background

Toyota Motor Corporation, a Japanese automaker, was established in 1937 by Kiichiro Toyoda (Toyota, 2010). Owing to its solid finances and rising demand for vehicles, it had become one of the strongest carmakers by mid-2008 (IHS, 2010), driven by a business strategy focused on product innovation and production efficiency (Takeuchi, Osono, and Shimizu, 2008).
By 2010, Toyota ranked fifth among the world's largest corporations (Fortune, 2010) and eleventh among the best global brands (Interbrand, 2010).

2.2 Toyota Global Vision 2020

Toyota's vision is to investigate and balance the relationship between the cycles of nature and the cycles of industry. Its slogan, "Open the Frontiers of Tomorrow", expresses the desire of Toyota and the efforts of its employees to realise society's dreams and build a way to a new world. The Toyota group believes this can be accomplished through the energy of people and technology (Figure 6) (Toyota, 2010).

Figure 6: Toyota global vision 2020. Source: Adapted from Toyota (2010)

2.3 Toyota Biodiversity Guidelines

The biodiversity framework (Figure 7), one of Toyota's sustainability principles, was developed to emphasise three areas: contributions through technology, collaboration and cooperation with society, and information disclosure (Toyota, 2010).

Figure 7: Toyota biodiversity guidelines. Source: Adapted from Toyota (2010)

2.4 Toyota Corporate Social Responsibility

Toyota's CSR policy covers three main areas: social, environmental, and economic (Figure 8). Guided by these principles, Toyota conducts all its business activities so as to create a harmonious and sustainable society in each country where it operates (Toyota, 2010).

Figure 8: Toyota CSR policy. Source: Adapted from Toyota (2010)

3. Toyota Strategic Management

3.1 The Toyota Way

In 1935, Sakichi Toyoda, the founder of the company, originally proposed five key principles to shape the company and its employees' beliefs (Toyota, 2010). Later, in order to provide rigorous training to a new generation of employees, the company documented these five principles and organised them under two pillars (Figure 9): Continuous Improvement and Respect for People (Toyota, 2010; Stewart and Raman, 2007).
The first pillar, continuous improvement, is also known as kaizen and is the basis of Toyota's business management (Liker, 2004). It focuses on individual learning and improvement through self-evaluation and creativity in pursuit of goals. The second pillar is expressed through employment security and the development of employees' participation and responsibility, which builds understanding, trust, and loyalty among team members (Liker, 2004).

Figure 9: The Toyota Way. Source: Adapted from Toyota (2010)

3.2 New JIT: a Management Technology Strategy Model of Toyota

A unique business strategy called New JIT (Figure 10) underlies Toyota's successful management and operations in the global market. New JIT is a management technology strategy model comprising three key areas: the Toyota Marketing System (TMS), the Toyota Production System (TPS), and the Toyota Development System (TDS) (Amasaka, 2002, 2007).

Figure 10: A management strategy model. Source: Amasaka (2002)

3.3 Toyota Production System (TPS)

The Toyota Production System (TPS) is the distinctive production system that gives Toyota a strong competitive advantage over its rivals (Toyota, 2010; Amasaka, 2002; Takeuchi, Osono, and Shimizu, 2008). TPS was designed by Taiichi Ohno, who was in charge of production for Toyota after the Second World War and implemented the strategy during the 1950s and 1960s. Today it is one of the most efficient production systems in the world, and many leading companies use it extensively as an ideal prototype (Toyota, 2010; Takeuchi, Osono, and Shimizu, 2008). TPS is commonly depicted as a 'house' diagram so that everyone can understand how it works (Figure 11).
The goal of TPS is to provide the best quality, the lowest cost, and the shortest lead time through the elimination of waste (Toyota, 2010; Amasaka, 2002). Figure 11 shows the Toyota Production System in its house form.

Figure 11: Toyota production system. Source: Toyota (2010)

3.4 Key Management Drivers

Takeuchi, Osono, and Shimizu (2008) identified the key to Toyota's success as six forces: the contradictions that shape the organisation can be divided into three forces of expansion and three forces of integration. The three forces of expansion drive change and improvement in the company. They are: setting impossible, or near-unattainable, goals from the viewpoint of senior executives; local customisation, producing products to suit local market needs and adapting business operations to each country and region on a common platform; and experimentation, since Toyota's eagerness to experiment helps it clear the hurdles that stand in the way of achieving those near-impossible goals. The three forces of integration maintain the balance between expansion and transformation; they can be seen in the values handed down from the founders' vision, commitment, and performance, in 'up-and-in' people management, and in open communication with employees at every level.

4. Assessing Toyota's Performance

4.1 Brand and Product Performance

In the passenger-car market Toyota owns and operates three major brands: Toyota, Lexus, and Scion (Toyota, 2010). Toyota offers more than 116 models across different segments and provides customised models to attract local customers in each region in order to increase its sales volume (IHS, 2010). Toyota's model strategy is shown in Figure 12.
Figure 12: Model strategy. Source: Adapted from Toyota (2010)

4.2 Global Operations and Expansion

According to Figure 13, as of 31 March 2010 Toyota had some 300,000 employees and 66 plants in 27 countries and regions, with approximately 170 distributors and 8,000 dealers worldwide (Toyota, 2010). The efficient Toyota Production System (TPS) and heavy investment in R&D give Toyota a stronger position than its rivals in the automobile market (Takeuchi, Osono, and Shimizu, 2008). According to Toyota (2010), the company produced 6,809,000 vehicles in FY2010, a slight decrease compared with the previous four years. Most of the vehicles distributed worldwide came from plants in Japan (58.1%), followed by Asia (15.6%) and North America (15.3%) (Figure 14) (Toyota, 2010). As for distribution strategy, Toyota implements two main logistical concepts: a regional-bases approach and a dock-based approach (Toyota, 2010). Although Toyota's head office is in Japan, subsidiaries have been founded around the world to create multiple regional headquarters, in North America, Asia, Europe, and elsewhere (IHS, 2010). This allows the company to pursue a localisation strategy that satisfies different customers' needs in each country. The dock-based operation, in turn, manages inspections and quality control before vehicles are distributed to the regional dealers (Toyota, 2010).

Figure 13: Toyota operations. Source: Toyota (2010)

Figure 14: Toyota production. Source: Toyota (2010)

4.3 Financial Performance

During FY2010, Toyota posted better-than-expected results despite the massive global recall saga, which affected almost 10 million vehicles worldwide. For the 12 months ending 31 March 2010, the automaker returned to profitability on the back of swift cost-cutting measures and a strong sales recovery in major markets, including Asia, rebounding from its first-ever annual loss posted during the previous fiscal year.
The automaker posted a net profit of ¥209.4 billion (US$2.25 billion) in FY2010, compared with a net loss of ¥437.0 billion the previous fiscal year. Operating profit stood at ¥147.5 billion, against an operating loss of ¥461.0 billion, while pre-tax profit was ¥291.4 billion, compared with a pre-tax loss of ¥560.4 billion. The significant improvement in operating earnings was largely due to ¥520 billion in savings from cost-cutting efforts and ¥470.0 billion in savings from reduced fixed costs. Revenues, however, declined by 7.7% year on year, from ¥20.5 trillion to almost ¥19.0 trillion, as a result of falling vehicle demand in major markets, including North America and Europe, and unfavourable currency-translation effects.

Figure 15: Financial performance. Source: Toyota (2010)

5. Toyota Strategic Analysis

5.1 Competitive Strategy

In terms of Porter's generic strategies (Porter, 1980), Toyota might appear to be 'stuck in the middle'. Thompson and Strickland (2008), however, describe its strategy as that of a 'best-cost provider', since Toyota offers customers more value for money: low-cost products with comparable quality and features. This is possible because of Toyota's revolutionary lean production system, which supports a product-differentiation strategy grounded in understanding customers, rather than a relentless pursuit of cost reduction in the cost-leader category (Thompson and Strickland, 2008), as shown in Figure 16.

Figure 16: Toyota's competitive strategy. Source: Author's own

5.2 Core Competencies

One characteristic that Toyota always emphasises is quality. Consumers choose the brand for its reliability and durability, and to stress that quality the company promotes its vehicles as 'Made by Toyota'.
This means the quality of its vehicles is exactly the same even though they are produced in different parts of the world (Toyota, 2010). A second remarkable feature that Toyota has worked to create, in order to change consumers' perceptions, is innovation. In 2000 Toyota launched the Prius hybrid to introduce its sustainable technology (Toyota, 2010). The product attracted strong customer attention, selling around a million units, and by 2010 full-hybrid technology had become one of its core competencies (Toyota, 2010). Last but not least is its unique production system, the Toyota Production System (TPS). The system includes just-in-time (JIT), one-piece flow, kaizen (continuous improvement), jidoka (automatic stop when a problem is detected), and heijunka (levelled production), enabling Toyota to deliver the best quality, the lowest cost, the shortest lead time, and the best safety (Amasaka, 2002). Consequently, Toyota achieves strong competitive advantages over its competitors.

5.3 SWOT Analysis

Strengths

Toyota's strong reputation and brand image are a significant competitive advantage that boosts sales in both domestic and international markets; consumers recognise the brand and its popularity, so they are more willing to pay for its premium products. Research and development (R&D) is emphasised to ensure and enhance quality and safety, and Toyota always pays close attention to environmental compatibility when developing new products. With its strong distribution channels and dealer network, Toyota can distribute and sell its products through 170 distributors and 8,000 dealers across the world. The Toyota Production System (TPS) has been developed successfully, giving Toyota efficient production with the best quality, the lowest cost, and the shortest lead time.

Weaknesses

Following two safety recalls announced by Toyota, consumers have less confidence in its products and its brand image.
The recalls also had a significant impact on the company's share price, causing drops on stock exchanges in Japan and overseas. The employees' pension fund has been shrinking owing to financial instability; since Toyota is bound by regulations on unfunded pensions, the company cannot fully control its liquidity position.

Opportunities

Regulation on energy saving and emissions presents a great opportunity for Toyota, which has developed the Prius hybrid over many years; hybrid technology is one of Toyota's most valuable competitive advantages and can lift its market position. Over the next ten years China, India, and the South East Asian countries are forecast to be the key driving automotive markets, so Toyota should take the opportunity to establish its brand clearly in these Asian markets, which could increase its market share effectively. With many new cars launched during 2009-2010, the company should attract greater customer interest, contributing to sales growth. The global automobile industry has recovered from the economic recession and is expected to accelerate gradually next year, and Toyota's products and services can be expanded through its worldwide distribution network.

Threats

Competition in the automobile industry is high, so sales of Toyota's vehicles may fall, affecting the company's finances and operations. Because emission standards differ from country to country, Toyota, which sells its products and services in more than 100 countries, bears additional costs to test, develop, and manufacture each product for each country. Toyota's financial position is also sensitive to foreign-exchange fluctuations, especially in the euro and the US dollar, which affect material costs and the prices of products sold in foreign currencies.
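The FY2010 figures in section 4.3 can be reconciled with a few lines of arithmetic. The sketch below uses the rounded yen figures quoted in the text; note that the rounded revenue totals give a decline of roughly 7.3%, slightly below the 7.7% the text reports from exact figures.

```python
# FY2010 vs FY2009 figures from section 4.3 (Toyota, 2010), in ¥bn.
# Rounded values from the text; the exact reported revenue decline is 7.7%.

fy2010 = {"net": 209.4, "operating": 147.5, "pretax": 291.4, "revenue": 19_000}
fy2009 = {"net": -437.0, "operating": -461.0, "pretax": -560.4, "revenue": 20_500}

# Swing = improvement from the prior-year loss to this year's profit
for k in ("net", "operating", "pretax"):
    swing = fy2010[k] - fy2009[k]
    print(f"{k} swing: +¥{swing:.1f}bn")

# Year-on-year revenue change from the rounded trillion-yen totals
rev_change = (fy2010["revenue"] - fy2009["revenue"]) / fy2009["revenue"]
print(f"revenue change: {rev_change:.1%}")
```

The swings (¥646.4bn net, ¥608.5bn operating, ¥851.8bn pre-tax) show how much of the turnaround the ¥520bn cost-cutting and ¥470bn fixed-cost savings account for.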
Conclusion

Toyota's success rests on the integration of its competitive advantages with a business philosophy of understanding people and balancing the cycles of nature and industry. Constant R&D and excellent product innovation have a major effect on Toyota's performance, its market growth, and its market share. Operational excellence is deployed as a strategic weapon to improve products and quality through the production strategy: the Toyota Production System (TPS) is efficiently developed and effectively integrated with the Toyota Marketing System (TMS) and the Toyota Development System (TDS). Local customisation and multi-segmentation make Toyota more powerful than its rivals and earn it positive perceptions from local consumers.