Thursday, 28 April 2016

WHAT IS INFORMATION MANAGEMENT

Information management (IM) concerns a cycle of organisational activity: the acquisition of information from one or more sources, the custodianship and the distribution of that information to those who need it, and its ultimate disposition through archiving or deletion.
This cycle of organisational involvement with information involves a variety of stakeholders: for example, those who are responsible for assuring the quality, accessibility and utility of acquired information; those who are responsible for its safe storage and disposal; and those who need it for decision making. Stakeholders might have rights to originate, change, distribute or delete information according to organisational information management policies.
Information management embraces all the generic concepts of management, including: planning, organizing, structuring, processing, controlling, evaluation and reporting of information activities, all of which are needed in order to meet the needs of those with organisational roles or functions that depend on information.
Information management is closely related to, and overlaps with, the management of data, systems, technology, processes and – where the availability of information is critical to organisational success – strategy. This broad view of the realm of information management contrasts with the earlier, more traditional view, that the life cycle of managing information is an operational matter that requires specific procedures, organisational capabilities and standards that deal with information as a product or a service.


What do you learn in information management? Among other things, these are the subjects taught in the Department of Information Management:
  • INTRODUCTION TO ECONOMICS
  • ELECTRONIC DATA PROCESSING
  • LOGIC AND ALGORITHM
  • CALCULUS
  • PROGRAMMING
  • QUICKBASIC PROGRAMMING
  • ENTREPRENEURSHIP
  • BASIC ACCOUNTING
  • LINEAR ALGEBRA
  • DESCRIPTIVE STATISTICS
  • INTRODUCTION TO BUSINESS
  • OPERATING SYSTEM
  • PASCAL PROGRAMMING
  • ORGANIZATIONAL BEHAVIOR
  • DATA STRUCTURES
  • VISUAL BASIC PROGRAMMING
  • COMMERCE PROGRAMMING PACKAGE
  • ANALYSIS OF INFORMATION SYSTEMS
  • GENERAL MANAGEMENT
  • OPERATIONS RESEARCH TECHNIQUES
  • STATISTICAL PROBABILITY
  • STRUCTURED PROGRAMMING
  • DATA COMMUNICATION
  • DELPHI PROGRAMMING
  • MANAGEMENT INFORMATION SYSTEM
  • COMPUTER GRAPHICS
  • DATABASE SYSTEMS
  • INFORMATION SYSTEM
  • INTERNET AND WEBSITE MANAGEMENT
  • RESEARCH METHODS
  • COMPUTER-AIDED DESIGN (CAD)
  • SOFTWARE AND HARDWARE INSTALLATION
  • COMPUTERIZED ACCOUNTING
  • BANKING AND TAXATION

WHAT DO YOU THINK ABOUT SEO?

"SEO"
Search engine optimization (SEO) is the process of affecting the visibility of a website or a web page in a search engine's unpaid results—often referred to as "natural," "organic," or "earned" results. In general, the earlier (or higher ranked on the search results page), and more frequently a site appears in the search results list, the more visitors it will receive from the search engine's users, and these visitors can be converted into customers. SEO may target different kinds of search, including image search, local search, video search, academic search, news search and industry-specific vertical search engines.
As an Internet marketing strategy, SEO considers how search engines work, what people search for, the actual search terms or keywords typed into search engines, and which search engines are preferred by their targeted audience. Optimizing a website may involve editing its content, HTML and associated coding to both increase its relevance to specific keywords and to remove barriers to the indexing activities of search engines. Promoting a site to increase the number of backlinks, or inbound links, is another SEO tactic. As of May 2015, mobile search had surpassed desktop search; Google is developing and pushing mobile search as the future of all of its products, and many brands are beginning to take a different approach to their Internet strategies.
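The "barriers to the indexing activities of search engines" mentioned above are concrete and checkable. Below is a minimal Python sketch, assuming a hypothetical site and page URL, of two common checks: whether robots.txt blocks a generic crawler from fetching the page, and whether the page carries a "noindex" robots meta tag. It uses only the standard library.

# Minimal sketch of two common indexing-barrier checks.
# The SITE and PAGE URLs are placeholders, not real endpoints.
import urllib.request
import urllib.robotparser
from html.parser import HTMLParser

SITE = "https://example.com"          # hypothetical site you control
PAGE = SITE + "/products/widgets"     # hypothetical page to check

# 1. Does robots.txt allow a generic crawler to fetch the page?
rp = urllib.robotparser.RobotFileParser()
rp.set_url(SITE + "/robots.txt")
rp.read()
print("robots.txt allows crawling:", rp.can_fetch("*", PAGE))

# 2. Does the page carry a <meta name="robots" content="noindex"> directive?
class RobotsMetaParser(HTMLParser):
    def __init__(self):
        super().__init__()
        self.noindex = False
    def handle_starttag(self, tag, attrs):
        if tag == "meta":
            a = dict(attrs)
            if (a.get("name") or "").lower() == "robots" and "noindex" in (a.get("content") or "").lower():
                self.noindex = True

html = urllib.request.urlopen(PAGE).read().decode("utf-8", errors="replace")
meta = RobotsMetaParser()
meta.feed(html)
print("page requests noindex:", meta.noindex)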

History

Webmasters and content providers began optimizing sites for search engines in the mid-1990s, as the first search engines were cataloging the early Web. Initially, all webmasters needed to do was to submit the address of a page, or URL, to the various engines which would send a "spider" to "crawl" that page, extract links to other pages from it, and return information found on the page to be indexed. The process involves a search engine spider downloading a page and storing it on the search engine's own server, where a second program, known as an indexer, extracts various information about the page, such as the words it contains and where these are located, as well as any weight for specific words, and all links the page contains, which are then placed into a scheduler for crawling at a later date.
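The crawl-index-schedule loop described above can be illustrated with a toy example. The Python sketch below is a deliberate simplification, not how any production engine works: it fetches a page, records each word and its position in an inverted index, and queues outgoing links for a later crawl. The seed URL is a placeholder.

# Toy crawl-index-schedule loop; the seed URL is a placeholder.
import re
import urllib.request
from collections import defaultdict, deque
from html.parser import HTMLParser
from urllib.parse import urljoin

class PageParser(HTMLParser):
    """Collects text fragments and outgoing links from one page."""
    def __init__(self):
        super().__init__()
        self.text_parts, self.links = [], []
    def handle_data(self, data):
        self.text_parts.append(data)
    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            if href:
                self.links.append(href)

index = defaultdict(list)                   # word -> list of (url, position)
schedule = deque(["https://example.com/"])  # crawl frontier (hypothetical seed)
seen = set()

while schedule and len(seen) < 10:          # small cap for the demo
    url = schedule.popleft()
    if url in seen:
        continue
    seen.add(url)
    try:
        page = urllib.request.urlopen(url, timeout=10).read().decode("utf-8", errors="replace")
    except Exception:
        continue                            # skip links that cannot be fetched
    parser = PageParser()
    parser.feed(page)
    words = re.findall(r"[a-z0-9]+", " ".join(parser.text_parts).lower())
    for pos, word in enumerate(words):      # the "indexer" step
        index[word].append((url, pos))
    for link in parser.links:               # the "scheduler" step
        schedule.append(urljoin(url, link))

print("pages crawled:", len(seen), "| distinct terms:", len(index))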
Site owners started to recognize the value of having their sites highly ranked and visible in search engine results, creating an opportunity for both white hat and black hat SEO practitioners. According to industry analyst Danny Sullivan, the phrase "search engine optimization" probably came into use in 1997. Sullivan credits Bruce Clay as being one of the first people to popularize the term. On May 2, 2007, Jason Gambert attempted to trademark the term SEO by convincing the Trademark Office in Arizona that SEO is a "process" involving manipulation of keywords, and not a "marketing service."
Early versions of search algorithms relied on webmaster-provided information such as the keyword meta tag, or index files in engines like ALIWEB. Meta tags provide a guide to each page's content. Using meta data to index pages was found to be less than reliable, however, because the webmaster's choice of keywords in the meta tag could potentially be an inaccurate representation of the site's actual content. Inaccurate, incomplete, and inconsistent data in meta tags could and did cause pages to rank for irrelevant searches.  Web content providers also manipulated a number of attributes within the HTML source of a page in an attempt to rank well in search engines.
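The weakness of the keywords meta tag is easy to demonstrate. The short, contrived Python sketch below pulls the declared keywords out of a sample page and checks whether they actually appear in the page's text; for a stuffed page they typically do not.

# Contrived example: declared meta keywords vs. actual page text.
from html.parser import HTMLParser

SAMPLE = """<html><head>
<meta name="keywords" content="cheap flights, hotels, travel deals">
<title>My cooking blog</title></head>
<body>Recipes for soups and stews.</body></html>"""

class KeywordAudit(HTMLParser):
    def __init__(self):
        super().__init__()
        self.keywords, self.text = [], []
    def handle_starttag(self, tag, attrs):
        if tag == "meta":
            a = dict(attrs)
            if (a.get("name") or "").lower() == "keywords":
                self.keywords = [k.strip().lower() for k in (a.get("content") or "").split(",")]
    def handle_data(self, data):
        self.text.append(data.lower())

audit = KeywordAudit()
audit.feed(SAMPLE)
body = " ".join(audit.text)
for kw in audit.keywords:
    print(f"'{kw}' appears in the page text: {kw in body}")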
By relying so much on factors such as keyword density which were exclusively within a webmaster's control, early search engines suffered from abuse and ranking manipulation. To provide better results to their users, search engines had to adapt to ensure their results pages showed the most relevant search results, rather than unrelated pages stuffed with numerous keywords by unscrupulous webmasters. Since the success and popularity of a search engine is determined by its ability to produce the most relevant results to any given search, poor quality or irrelevant search results could lead users to find other search sources. Search engines responded by developing more complex ranking algorithms, taking into account additional factors that were more difficult for webmasters to manipulate.
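"Keyword density" here simply means the share of a page's words that match a target term, which is exactly why it was so easy for webmasters to inflate. A minimal Python sketch of the calculation:

# Minimal keyword-density calculation (the signal early engines over-relied on).
import re

def keyword_density(text, keyword):
    words = re.findall(r"[a-z0-9]+", text.lower())
    if not words:
        return 0.0
    hits = sum(1 for w in words if w == keyword.lower())
    return hits / len(words)

page_text = "Buy shoes online. Cheap shoes, running shoes, best shoes in town."
print(f"density of 'shoes': {keyword_density(page_text, 'shoes'):.0%}")  # about a third of the words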
By 1997, search engine designers recognized that webmasters were making efforts to rank well in their search engines, and that some webmasters were even manipulating their rankings in search results by stuffing pages with excessive or irrelevant keywords. Early search engines, such as Altavista and Infoseek, adjusted their algorithms in an effort to prevent webmasters from manipulating rankings.
In 2005, an annual conference, AIRWeb (Adversarial Information Retrieval on the Web), was created to bring together practitioners and researchers concerned with search engine optimization and related topics.
Companies that employ overly aggressive techniques can get their client websites banned from the search results. In 2005, the Wall Street Journal reported on a company, Traffic Power, which allegedly used high-risk techniques and failed to disclose those risks to its clients. Wired magazine reported that the same company sued blogger and SEO Aaron Wall for writing about the ban. Google's Matt Cutts later confirmed that Google did in fact ban Traffic Power and some of its clients.
Some search engines have also reached out to the SEO industry, and are frequent sponsors and guests at SEO conferences, chats, and seminars. Major search engines provide information and guidelines to help with site optimization. Google has a Sitemaps program to help webmasters learn if Google is having any problems indexing their website and also provides data on Google traffic to the website. Bing Webmaster Tools provides a way for webmasters to submit a sitemap and web feeds, allows users to determine the crawl rate, and track each web page's index status.
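For reference, the sitemap file that those webmaster tools accept is a small XML document following the public sitemaps.org 0.9 schema. The sketch below generates one with Python's standard library; the URLs and dates listed are placeholders.

# Generate a minimal sitemap.xml (sitemaps.org 0.9 schema); URLs are placeholders.
import xml.etree.ElementTree as ET

pages = [
    ("https://example.com/", "2016-04-28"),
    ("https://example.com/about", "2016-04-01"),
]

urlset = ET.Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
for loc, lastmod in pages:
    url = ET.SubElement(urlset, "url")
    ET.SubElement(url, "loc").text = loc
    ET.SubElement(url, "lastmod").text = lastmod

ET.ElementTree(urlset).write("sitemap.xml", encoding="utf-8", xml_declaration=True)
print(open("sitemap.xml").read())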