
Ajax Technology in Web


AJAX

Ajax, shorthand for Asynchronous JavaScript
and XML

• Web development technique for creating
interactive web applications

• The intent is to make web pages feel more
responsive by exchanging small amounts of
data with the server behind the scenes, so
that the entire web page does not have to be
reloaded each time the user makes a change

• This is meant to increase the web page's
interactivity, speed, and usability
• The first known use of the term in public was by Jesse James Garrett in his February 2005 article Ajax: A New Approach to Web Applications

• At subsequent talks and seminars
Garrett has made the point that Ajax is
not an acronym
Ajax Technology


The Ajax technique uses a combination of:

– XHTML (or HTML), CSS, for marking up and styling information.

– The DOM accessed with a client-side scripting language,
especially ECMAScript implementations such as JavaScript
and JScript, to dynamically display and interact with the
information presented.

– The XMLHttpRequest object to exchange data asynchronously
with the web server. In some Ajax frameworks and in certain
situations, an IFrame object is used instead of the
XMLHttpRequest object to exchange data with the web server.

– XML is sometimes used as the format for transferring data
between the server and client, although any format will work,
including preformatted HTML, plain text, JSON and other
formats.

• Like DHTML, LAMP, or SPA, Ajax is not a technology in
itself, but a term that refers to the use of a group of
technologies together.

XMLHttpRequest

• XMLHttpRequest is an API that can be
used by JavaScript, JScript, VBScript
and other web browser scripting
languages to transfer and manipulate
XML data to and from a web server
using HTTP, establishing an
independent connection channel
between a web page's Client-Side and
Server-Side.

• The XMLHttpRequest concept was originally
developed by Microsoft.

• The Microsoft implementation is called
XMLHTTP and, as an ActiveX object, it differs
from the published standard in a few small
ways. It has been available since Internet
Explorer 5.0 and is accessible via JScript,
VBScript and other scripting languages
supported by IE browsers.

• The Mozilla project incorporated the first
compatible native implementation of
XMLHttpRequest in Mozilla 1.0 in 2002.

• This implementation was later followed
by Apple since Safari 1.2, Konqueror,
Opera Software since Opera 8.0 and
iCab since 3.0b352.

• The World Wide Web Consortium published a
Working Draft specification for the
XMLHttpRequest object's API on 5 April
2006.

• While this is still a work in progress, its goal is
"to document a minimum set of interoperable
features based on existing implementations,
allowing Web developers to use these
features without platform-specific code".

• The draft specification is based upon existing
popular implementations, to help improve and
ensure interoperability of code across web
platforms.

• Methods:
– abort()
– getAllResponseHeaders()
– getResponseHeader(header)
– open(method, url, async, user, password)
– send(content)
– setRequestHeader(header, value)


• open(method, url, async,
user, password):
– Initializes an XMLHTTP request.
– Specifies the method, URL, and
authentication information for the request.
– After calling this method, you must call
send to send the request and data, if any,
to the server.

• send(content):
– Sends an HTTP request to the server and
receives a response.
– null for no data.
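
A minimal usage sketch combining these methods (the URL, request body, header value and handler name below are placeholders, not part of the original material):

// Hedged sketch: typical XMLHttpRequest method usage.
var request = new XMLHttpRequest();
request.onreadystatechange = handleResponse;   // assumed handler function, assigned by name
request.open("POST", "example.jsp", true);     // placeholder URL, asynchronous
request.setRequestHeader("Content-Type", "application/x-www-form-urlencoded");
request.send("name=value");                    // or request.send(null) for no data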

• Properties:
– onreadystatechange
– readyState
– responseText
– responseXML
– status
– statusText

• onreadystatechange:
– Function that handles the different events

• readyState:
– The property is read-only
– It represents the state of the request as an
integer
– The following values are defined:

• readyState:
– 0 (UNINITIALIZED): The object has been created, but not initialized (the open method has not been called).
– 1 (LOADING): The object has been created, but the send method has not been called.
– 2 (LOADED): The send method has been called, but the status and headers are not yet available.
– 3 (INTERACTIVE): Some data has been received. Calling the responseText property at this state to obtain partial results will return an error, because status and response headers are not fully available.
– 4 (COMPLETED): All the data has been received, and the complete data is available in the responseText property.

• responseText:
– The property is read-only.
– This property represents only one of
several forms in which the HTTP response
can be returned.

• responseXML:
– The property is read-only.
– This property represents the parsed
response entity body.

AJAX step by step

1. Create XMLHttpRequest object
2. Assign a function to the state change event
3. Send a request to the server
4. On a state change, manage the response
5. On a correct response, process the result
and show to the user.

Create XMLHttpRequest object

• Depending on the browser:
– Internet Explorer
request = new ActiveXObject("Microsoft.XMLHTTP");
– Other browsers:
request = new XMLHttpRequest();

• Code adapted for different browsers:
if (window.XMLHttpRequest) {
    request = new XMLHttpRequest();
}
else if (window.ActiveXObject) {
    request = new ActiveXObject("Microsoft.XMLHTTP");
}

Assign a function to the state change event

• This function will be called
automatically, every time the state of
the XMLHttpRequest object changes:
request.onreadystatechange = nameOfFunction
Important: without “( )”, only the name.

Send a request to the server
• Open the connection, define the method and
the type of connection:
– A synchronous connection (false) blocks the
browser until the response is obtained
– An asynchronous connection (true, the default value) executes in the background
– Important: the URL must belong to the same
domain of the current page
request.open('GET','http://www.ua.es/ajax.jsp',
true);
• Send the additional data: request.send(data or null)

On a state change, manage the response
• The handler is called every time there is a change:
• 0: UNINITIALIZED
• 1: LOADING
• 2: LOADED
• 3: INTERACTIVE
• 4: COMPLETED
• Example of handler:
if (request.readyState == 4) {    // Finished
    if (request.status == 200) {  // OK
        // Process the result
    }
}
else {
    // Not finished
}

On a correct response, process the result and
show to the user

• The result can be in different formats:
plain text, HTML, JSON, XML, etc.

• responseText when the result is not structured as XML:
alert(request.responseText);

• responseXML when the result is structured as XML:
– Returns an XMLDocument object
– Use DOM functions
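
For instance, a minimal sketch assuming the server returned an XML document such as <times><time>10:30:01</time></times> (the element name is invented for this example):

// Hedged sketch: read a structured XML response with DOM functions.
var xmlDoc = request.responseXML;                  // an XMLDocument object
var nodes  = xmlDoc.getElementsByTagName("time");  // standard DOM traversal
for (var i = 0; i < nodes.length; i++) {
    alert(nodes[i].firstChild.nodeValue);          // text content of each <time> element
}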

Example

<script type="text/javascript">
function ajaxFunction() {
var xmlHttp;
if (window.XMLHttpRequest)
xmlHttp = new XMLHttpRequest();
else
xmlHttp = new ActiveXObject("Microsoft.XMLHTTP");
xmlHttp.onreadystatechange=function() {
if(xmlHttp.readyState == 4) {
document.myForm.time.value += xmlHttp.responseText + "\n";
}
}
xmlHttp.open("GET","time.php",true);
xmlHttp.send(null);
}
</script>

Another example

Example
<html>
<head>
<title>Ajax example</title>
<!-- script -->
</head>
<body>
<form name="myForm">
Name: <input type="text"
onkeyup="ajaxFunction();" name="username" />
<br />
Time: <textarea name="time" cols="40"
rows="10"></textarea>
</form>
</body>
</html>

• PHP:
<?php
header("Expires: -1");
$str1 = date('h:i:s A');
sleep(2);
$str2 = date('h:i:s A');
echo "$str1 -- $str2";
?>

Who’s Using Ajax

Google is making a huge investment in developing the Ajax approach. All of the major products Google has introduced over the last
year — Orkut, Gmail, the latest beta version of Google Groups, Google Suggest, and Google Maps — are Ajax applications. (For more
on the technical nuts and bolts of these Ajax implementations, check out these excellent analyses of Gmail, Google Suggest, and
Google Maps.) Others are following suit: many of the features that people love in Flickr depend on Ajax, and Amazon’s A9.com
search engine applies similar techniques.

These projects demonstrate that Ajax is not only technically sound, but also practical for real-world applications. This isn’t another
technology that only works in a laboratory. And Ajax applications can be any size, from the very simple, single-function Google
Suggest to the very complex and sophisticated Google Maps.

At Adaptive Path, we’ve been doing our own work with Ajax over the last several months, and we’re realizing we’ve only scratched the
surface of the rich interaction and responsiveness that Ajax applications can provide. Ajax is an important development for Web
applications, and its importance is only going to grow. And because there are so many developers out there who already know how to use these technologies, we expect to see many more organizations following Google’s lead in reaping the competitive advantage Ajax
provides.

5G Mobile Technology





Introduction


Present-day cell phones have it all: small size, large memory, speed dialing, a video player, an audio player, a camera and so on. Recently, with the development of piconets and Bluetooth technology, data sharing has become child's play. Earlier, with the infrared feature, data could be shared only within a line of sight, meaning the two devices had to be aligned properly to transfer anything; with Bluetooth, data can be transferred even with the cell phone in your pocket, up to a range of 50 meters. The creation and entry of 5G technology into the mobile marketplace will launch a new revolution in the way international cellular plans are offered.

The global mobile phone era is upon the cell phone market. Just around the corner, the newest 5G technologies will hit the mobile market, with phones used in China able to access and call phones in Germany as if they were local. This is truly innovative technology, changing the way mobile phones will be used. With the emergence of cell phones similar to a PDA, you can now have your whole office within the phone. Cell phones will give tough competition to laptop manufacturers and computer designers. Even today there are phones with gigabytes of memory storage and the latest operating systems. Thus one can say that, with current trends, the industry has a bright future if it can adopt the best technologies and produce affordable handsets for its customers. All these desires will be unleashed in the near future when such smart phones take over the market. 5G Network's router and switch technology delivers Last Yard Connectivity between the Internet access provider and building occupants, and 5G's technology intelligently distributes Internet access to individual nodes within the building.

2G-5G Networks

The first generation of mobile phones consisted of analog systems that emerged in the early 1980s. The second generation of digital mobile phones appeared in the 1990s along with the first digital mobile networks. During the second generation, the mobile telecommunications industry experienced exponential growth in terms of both subscribers and value-added services. Second generation networks allow limited data support, in the range of 9.6 kbps to 19.2 kbps. Traditional phone networks are used mainly for voice transmission, and are essentially circuit-switched networks.
2.5G networks, such as General Packet Radio Service (GPRS), are an extension of 2G networks in that they use circuit switching for voice and packet switching for data transmission, which accounts for their popularity, since packet switching utilizes bandwidth much more efficiently. In this system, each user's packets compete for available bandwidth, and users are billed only for the amount of data transmitted.
3G networks were proposed to eliminate many problems faced by 2G and 2.5G networks, especially the low speeds and incompatible technologies such as Time Division Multiple Access (TDMA) and Code Division Multiple Access (CDMA) in different countries. Expectations for 3G included increased bandwidth: 128 Kbps for mobile stations and 2 Mbps for fixed applications. In theory, 3G should work over North American as well as European and Asian wireless air interfaces. In reality, the outlook for 3G is not very certain. Part of the problem is that network providers in Europe and North America currently maintain separate standards bodies (3GPP for Europe and Asia; 3GPP2 for North America), and these bodies have not resolved the differences in air interface technologies.
There is also a concern that in many countries 3G will never be deployed due to its cost and poor performance. Although it is possible that some of the weaknesses at the physical layer will still exist in 4G systems, an integration of services at the upper layers is expected. The evolution of mobile networks is strongly influenced by business challenges and the direction the mobile system industry takes. It also relates to the radio access spectrum and the control restrictions over it, which vary from country to country. However, as major technical advances are being standardized, it becomes more complex for industry alone to choose a suitable evolutionary path. Many mobile system standards for Wide Area Networks (WANs) already exist, including popular ones such as the Universal Mobile Telecommunications System (UMTS), CDMA, and CDMA-2000 (1X/3X). In addition there are evolving standards for Personal Area Networks (PANs), such as Bluetooth, and for WLANs, such as IEEE 802.11.
The current trend in mobile systems is to support high bit rate data services on the downlink via High Speed Downlink Packet Access (HSDPA). It provides a smooth evolutionary path for UMTS networks to higher data rates, in the same way as Enhanced Data rates for Global Evolution (EDGE) does in the Global System for Mobile communication (GSM). HSDPA uses shared channels that allow different users to access the channel resources in the packet domain. It provides an efficient means to share spectrum, supporting high data rate packet transport on the downlink, which is well adapted to urban environments and indoor applications. Initially, peak data rates of 10 Mbps may be achieved using HSDPA. The next target is to reach 30 Mbps with the help of antenna array processing technologies, followed by enhancements in air interface design to allow even higher data rates. Another recent development is a new framework for mobile networks that is expected to provide multimedia support for IP telecommunication services, called the IP Multimedia Subsystem (IMS). Real-time rich multimedia communication mixing telecommunication and data services has become possible in wireline broadband networks thanks to IMS. However, mobile carriers cannot yet offer their customers the freedom to mix multimedia components (text, pictures, audio, voice, video) within one call; today a two-party voice call cannot be extended to a multi-party audio and video conference. IMS overcomes such limitations and makes these scenarios possible.

Network Architecture

The basic architecture of a wireless mobile system consists of a mobile phone connected to the wired world via a single-hop wireless connection to a Base Station (BS), which is responsible for carrying the calls within its region, called a cell (Figure 1). Due to the limited coverage provided by a BS, mobile hosts change their connecting base stations as they move from one cell to another.

{{Wireless Mobile System Network Architecture}}

A hand-off (later referred to as “horizontal handoff” in this article) occurs when a mobile system changes its BS. The mobile station communicates via the BS using one of the wireless frequency sharing technologies such as FDMA, TDMA, CDMA, etc. Each BS is connected to a Mobile Switching Center (MSC) through fixed links, and each MSC is connected to others via the Public Switched Telephone Network (PSTN). The MSC is a local switching exchange that handles the switching of a mobile user from one BS to another. It also locates the current cell of a mobile user via a Home Location Register (HLR), which stores the current location of each mobile that belongs to the MSC. In addition, the MSC contains a Visitor Location Register (VLR) with information about visiting mobiles from other cells. The MSC is responsible for determining the current location of a target mobile using the HLR, the VLR and communication with other MSCs. The source MSC initiates a call setup message to the MSC covering the target area for this purpose.

The first generation cellular implementation consisted of analog systems in 450-900 MHz frequency range using frequency shift keying for signaling and Frequency Division Multiple Access (FDMA) for spectrum sharing. The second generation implementations consist of TDMA/CDMA implementations with 900, 1800 MHz frequencies. These systems are called GSM for Europe and IS-136 for US. The respective 2.5G implementations are called GPRS and CDPD followed by 3G implementations. Third generation mobile systems are intended to provide a global mobility with wide range of services including voice calls, paging, messaging, Internet and broadband data. IMT-2000 defines the standard applicable for North America. In Europe, the equivalent UMTS standardization is in progress. In 1998, a Third Generation Partnership Project (3GPP) was formed to unify and continue the technical specification work. Later, the Third Generation Partnership Project 2 (3GPP2) was formed for technical development of CDMA-2000 technology.
 3G mobile offers access to broadband multimedia services, which are expected to become all-IP based in future 4G systems. However, current 3G networks are not based on IP; rather they are an evolution of existing 2G networks. Work is going on to provide 3G support and Quality of Service (QoS) in IP and mobility protocols. The situation gets more complex when we consider WLAN research and expect WLANs to become mobile. It is expected that WLANs will be installed in trains, trucks, and buildings. In addition, they may be formed on an ad-hoc basis (like ad-hoc networks) between random collections of devices that happen to come within radio range of one another (Figure 2). In general, 4G architecture includes three basic areas of connectivity: PANs (such as Bluetooth), WLANs (such as IEEE 802.11), and cellular connectivity. Under this umbrella, 4G will provide a wide range of mobile devices that support global roaming.
Each device will be able to interact with Internet-based information that will be modified on the fly for the network being used by the device at that moment (Figure 3). In 5G mobile IP, each cell phone is expected to have a permanent "home" IP address, along with a "care-of" address that represents its actual location. When a computer somewhere on the Internet needs to communicate with the cell phone, it first sends a packet to the phone's home address.

A directory server on the home network forwards this to the care-of address via a tunnel, as in regular mobile IP. However, the directory server also sends a message to the computer informing it of the correct care-of address, so future packets can be sent directly. This should enable TCP sessions and HTTP downloads to be maintained as users move between different types of networks. Because of the many addresses and the multiple layers of subnetting, IPv6 is needed for this type of mobility. For instance, the 128 bits (four times the current 32-bit IPv4 address) may be divided into four parts (I through IV) supporting different functions. The first 32-bit part (I) may be defined as the home address of a device, while the second part (II) may be declared the care-of address, allowing communication between cell phones and personal computers. Once the communication path between cell phone and PC is established, the care-of address is used instead of the home address, thus using the second part of the IPv6 address.

The third part (III) of the IPv6 address may be used for tunneling to establish a connection between wireline and wireless networks. In this case an agent (a directory server) will use the mobile IP address to establish a channel to cell phones. The fourth and last part (IV) of the IPv6 address may be used as a local address for VPN sharing. Figure 4 illustrates the concept. The goal of 4G and 5G is to replace the current proliferation of core mobile networks with a single worldwide core network standard, based on IPv6 for control, video, packet data, and voice. This will provide uniform video, voice, and data services to the mobile host, based entirely on IPv6. The objective is to offer seamless multimedia services to users accessing an all IP-based infrastructure through heterogeneous access technologies. IPv6 is assumed to act as an adhesive for providing global connectivity and mobility among networks. Most wireless companies are looking forward to IPv6, because they will be able to introduce new services. The Japanese government is requiring all of Japan's ISPs to support IPv6 with its first 4G launch. Although the US upgrade to IPv6 is less advanced, WLAN's advancement may provide a shortcut to 4G.
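
Purely as an illustration of the four-part split described above (an assumption-laden sketch, not a real IPv6 addressing scheme), a fully expanded 128-bit address can be cut into four 32-bit fields:

// Hedged sketch: split a fully expanded IPv6 address into the four 32-bit
// parts (I-IV) discussed in the text; the field roles are illustrative only.
function splitIPv6(addr) {
    var groups = addr.split(":").map(function (g) { return parseInt(g, 16); });
    if (groups.length !== 8) throw new Error("expected a fully expanded IPv6 address");
    function part(i) { return groups[2 * i] * 0x10000 + groups[2 * i + 1]; }
    return {
        homeAddress:   part(0), // part I:   permanent "home" identity
        careOfAddress: part(1), // part II:  current point of attachment
        tunnelId:      part(2), // part III: tunneling between wireline and wireless networks
        vpnLocal:      part(3)  // part IV:  local address for VPN sharing
    };
}

console.log(splitIPv6("2001:0db8:0000:0001:0000:0002:0000:0003"));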

Mix-Bandwidth Data Path Design

The CDMA Development Group (CDG) has issued a convergence architecture for 4G which combines pico cell, micro cell, macro cell and global area, as shown in Figure 5. This architecture clearly shows that in the pico-cell area four wireless networks overlap, in the micro-cell area three overlap, and in the macro-cell area at least two overlap. The problem is that, for a given user at a certain place and time, only one network supplies wireless services, while the other networks' resources go to waste. 5G is the real wireless world, with complete wireless communication. We design a mix-bandwidth data path for 5G so that all wireless network resources can be used efficiently.

Mix-Bandwidth Data Path Model Design

In order to design the mix-bandwidth data path, we propose a new data model as shown in Figure 6. This model is based on the overlay area of any two networks. When a mobile node enters the overlay area, both networks can supply services to it simultaneously. A data request can be sent over either network, and the reply can come from the other.

{{Fig: Mix-bandwidth Data Path Model}}
In this model, the MN request can go through the first connection (MN → BS → PDSN → CN) and the resulting reply can come from the second connection (CN → PDSN → AP → MN). Thus, two networks supply services to the mobile node simultaneously. Following this model, we propose the mix-bandwidth data path shown in Figure, which contains four components: bandwidth management, bandwidth selection, packet receiver and bandwidth monitor. A sketch of the bandwidth selection idea follows.
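
As a hedged sketch of the bandwidth selection component named above (the network names and free-bandwidth figures are illustrative assumptions, not part of the proposed design), a mobile node in an overlay area could simply route over whichever attached network currently reports more free bandwidth:

// Hedged sketch: pick the attached network with the most free bandwidth.
var attachedNetworks = [
    { name: "cellular (BS/PDSN)", freeMbps: 1.8 },
    { name: "WLAN (AP)",          freeMbps: 11.0 }
];

function selectPath(networks) {
    return networks.reduce(function (best, n) { return n.freeMbps > best.freeMbps ? n : best; });
}

console.log(selectPath(attachedNetworks).name); // "WLAN (AP)"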

Mobile - Wireless Grids
Mobile computing plays a seminal role in the implementation of 4G mobile communication systems, since it centers on the requirement of providing access to various communications and services everywhere, at any time and by any available means. Presently, the technical solutions for achieving mobile computing are hard to implement, since they require the creation of communication infrastructures and the modification of operating systems, application programs and computer networks, on account of the limited capability of a moving resource in contrast to a fixed one.
 In the purview of Grid and Mobile Computing, the Mobile Grid is an heir of the Grid that addresses mobility issues, with the added elements of supporting mobile users and resources in a seamless, transparent, secure and efficient way. It has the facility to organize underlying ad-hoc networks and offer a self-configuring Grid system of mobile resources (hosts and users) connected by wireless links and forming random and changeable topologies. The mobile Grid needs to be upgraded from the general Grid concept to make full use of all the capabilities that will be available; these functionalities will involve end-to-end solutions with emphasis on Quality of Service (QoS) and security, as well as interoperability issues between the diverse technologies involved. Further, enhanced security policies and approaches to address large-scale and heterogeneous environments will be needed. Additionally, the volatile, mobile and poorly networked environments have to be addressed with adaptable QoS aspects, contextualized with respect to users and their profiles.

 Wireless Grids

Grid computing lets devices connected to the Internet, overlay peer-to-peer networks, and the nascent wired computational grid dynamically share network connected resources in 4G kind of scenario. The wireless grid extends this sharing potential to mobile, nomadic, or fixed-location devices temporarily connected via ad hoc wireless networks. Following Metcalfe’s law, grid-based resources become more valuable as the number of devices and users increases. The wireless grid makes it easier to extend grid computing to large numbers of devices that would otherwise be unable to participate and share resources. While grid computing attracts much research, resource sharing across small, ad hoc, mobile, and nomadic grids draws much less. Wireless grids, a new type of resource-sharing network, connect sensors, mobile phones, and other edge devices with each other and with wired grids. Ad hoc distributed resource sharing allows these devices to offer new resources and locations of use for grid computing. In some ways, wireless grids resemble networks already found in connection with agricultural, military, transportation, air-quality, environmental, health, emergency, and security systems.

{{Dynamic and fixed wireless grids}}

A range of institutions, from the largest governments to very small enterprises, will own and at least partially control wireless grids. To make things still more complex for researchers and business strategists, users and producers could sometimes be one and the same. Devices on the wireless grid will be not only mobile but nomadic, shifting across institutional boundaries. Just as real-world nomads cross institutional boundaries and frequently move from one location to another, so do wireless devices. The following classification offers one way to classify wireless grid applications.

(a) Class 1: Applications aggregating information from the range of input/output interfaces found in nomadic devices.

(b) Class 2: Applications leveraging the locations and contexts in which the devices exist.

(c) Class 3: Applications leveraging the mesh network capabilities of groups of nomadic devices.

The three classes of wireless grid applications conceptualized here are not mutually exclusive. Understanding more about the shareable resources, the places of use, and the ownership and control patterns within which wireless grids will operate might assist us in visualizing these future patterns of wireless grid use. The Grid is a promising emerging technology that enables a simple “connect and share” approach, analogous to internet search engines that apply the “connect and acquire information” concept. Thus, mobile/wireless grids are an ideal solution for the large-scale applications that are the pith of 4G mobile communication systems; besides, this grid-based approach will potentially increase the performance of the involved applications and the utilization rate of resources by employing efficient mechanisms for resource management across the majority of its resources, that is, by allowing the seamless integration of resources, data, services and technologies. Figure 2 places wireless grids in context, illustrating how they span the technical approaches and issues of Web services, grid computing, P2P systems, mobile commerce, ad hoc networking, and spectrum management. How sensor and mesh networks will ultimately interact with software radio and other technologies to solve wireless grid problems requires a great deal of further research, but Figure 4 at least captures many of the main facets of a wireless grid.

Key Concepts of 5G

Key concepts suggested in research papers discussing 5G and beyond-4G wireless communications include:
(a) Real wireless world with no more limitation with access and zone issues.
(b) Wearable devices with AI capabilities.
(c) Internet protocol version 6 (IPv6), where a visiting care-of mobile IP address is assigned according to location and connected network.
(d) One unified global standard.
(e) Pervasive networks providing ubiquitous computing: The user can simultaneously be connected to several wireless access technologies and seamlessly move between them (See Media independent handover or vertical handover, IEEE 802.21, also expected to be provided by future 4G releases). These access technologies can be a 2.5G, 3G, 4G or 5G mobile networks, Wi-Fi, WPAN or any other future access technology. In 5G, the concept may be further developed into multiple concurrent data transfer paths.
(f) Cognitive radio technology, also known as smart-radio: allowing different radio technologies to share the same spectrum efficiently by adaptively finding
unused spectrum and adapting the transmission scheme to the requirements of the technologies currently sharing the spectrum. This dynamic radio resource management is achieved in a distributed fashion, and relies on software defined radio.
(g) High altitude stratospheric platform station (HAPS) systems.

The radio interface of 5G communication systems is suggested in a Korean research and development program to be based on beam division multiple access (BDMA) and group cooperative relay techniques.

Features of 5G Networks Technology

Main features of 5G network technology are as follows:
(a) 5G technology offers high resolution for crazy cell phone users and bi-directional large-bandwidth shaping.
(b) The advanced billing interfaces of 5G technology make it more attractive and effective.
(c) 5G technology also provides subscriber supervision tools for fast action.
(d) The high-quality services of 5G technology are based on policies to avoid errors.
(e) 5G technology provides large broadcasting of data in gigabits, supporting almost 65,000 connections.
(f) 5G technology offers a transporter-class gateway with unparalleled consistency.
(g) The traffic statistics gathered by 5G technology make it more accurate.
(h) Through the remote management offered by 5G technology, a user can get a better and faster solution.
(i) Remote diagnostics is also a great feature of 5G technology.
(j) 5G technology provides up to 25 Mbps connectivity speed.
(k) 5G technology also supports virtual private networks.
(l) The new 5G technology will take all delivery services out of the business prospect.
(m) The uploading and downloading speeds of 5G technology touch the peak.
(n) The 5G network technology offers enhanced connectivity available just about anywhere in the world.
A new revolution in 5G technology is about to begin, because 5G technology is going to give tough competition to normal computers and laptops, whose market value will be affected. There have been many improvements from 1G, 2G, 3G and 4G to 5G in the world of telecommunications. The coming 5G technology will be available in the market at affordable rates, with high peak rates and much more reliability than its preceding technologies. The features being embedded in such a small piece of electronics are huge. Today you will hardly find a cell phone without an MP3 player with huge storage memory and a camera. We can use the cell phone as a Walkman.

Even the latest handsets being launched by the cell phone companies have megapixel cameras in them, which produce extraordinary digital images just like a specialized photography camera. Here is an example of mobile technology in our future. A man's phone detects that it hasn't moved for more than 2 hours during the man's regular waking hours. It issues an audible alarm, but there is no response, so it emits a signal that triggers an RFID chip implanted inside his body. The RFID chip responds by verifying the identity of the man along with a brief burst of telemetry indicating that he is experiencing heartbeat irregularities and that his blood pressure is dangerously low. The phone quickly sends an automated text message to a medical alarm system, including not only the identity and health data of the owner but also the fact that the man is not in his own apartment but in a reading room of a library.

Conclusion


 There are some other projects being undertaken by 5G technologies. Here we want to mention that 3G mobiles are in use these days and 4G technologies are coming, but in the future we will be ready to face 5G technologies, some of whose features we have presented in this paper.

Web Mining

Web Mining Introduction

 What is Web Mining?   
 


          Web mining is the use of data mining techniques to automatically discover and extract information from Web documents and services. Web mining deals with three main areas: web content mining, web usage mining and web structure mining. In web usage mining it is desirable to find the habits of the website's users and the relations between what they are looking for. To find the actual users, some filtering has to be done to remove bots that index the structure of a website.
         Robots view all pages and links on a website to find relevant content. This creates many calls to the website server and thereby creates a false image of the actual web usage. The paper we have chosen to start with [Tang et al. 2002] does not discuss web content and web structure mining in depth, but instead looks more closely at web usage mining. This field is supposed to describe relations between web pages based on the interests of users, i.e. finding links often clicked in a specific order which are of greater relevance to the user. The patterns revealed will then be used to create a more visitor-customized website by highlighting or otherwise exposing web pages to increase commerce. This is often demonstrated as a price cut on one product which will increase sales of another. On the other hand, it is also important not to misclassify actual users that make thorough searches of websites and label them as robots. A minimal sketch of such bot filtering is shown below.
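
As a minimal sketch of such filtering (the log-entry format, user-agent strings and patterns below are illustrative assumptions, not from the paper), obvious robots can be removed from a usage log by matching common crawler user-agent substrings before any usage mining is done:

// Hedged sketch: filter likely robots out of a web-usage log by user agent.
var botPatterns = [/bot/i, /crawler/i, /spider/i, /slurp/i];

function isProbablyBot(entry) {
    return botPatterns.some(function (p) { return p.test(entry.userAgent); });
}

var log = [
    { ip: "10.0.0.1", url: "/products", userAgent: "Mozilla/5.0 (Windows NT 10.0)" },
    { ip: "10.0.0.2", url: "/products", userAgent: "Googlebot/2.1 (+http://www.google.com/bot.html)" }
];

var humanRequests = log.filter(function (e) { return !isProbablyBot(e); });
console.log(humanRequests); // only the first (human) request remains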

 1. Motivation / Opportunity The WWW is a huge, widely distributed, global information service centre and, therefore, constitutes a rich source for data mining.

 01 Personalization, Recommendation Engines.
 02 Web-commerce applications. 
 03 Building the Semantic Web. 
 04 Intelligent Web Search.
 05 Hypertext classification and Categorization. 
 06 Information / trend monitoring. 
 07 Analysis of online communities. 

 2. The Web 
 01. Over 1 billion HTML pages, 15 terabytes
 02. Wealth of information
      a. Bookstores, restaurants, travel, malls, dictionaries, news, stock quotes, yellow & white pages, maps, markets, ...
      b. Diverse media types: text, images, audio, video
      c. Heterogeneous formats: HTML, XML, postscript, pdf, JPEG, MPEG, MP3
 03. Highly Dynamic
      a. 1 million new pages each day
      b. Average page changes in a few weeks
 04. Graph structure with links between pages
      a. Average page has 7-10 links
      b. In-links and out-links follow power-law distribution
 05. Hundreds of millions of queries per day

 3. Abundance and authority crisis. 
 01. Liberal and informal culture of content generation and dissemination 
 02. Redundancy and non-standard form and content 
 03. Millions of qualifying pages for most broad queries Example: java or kayaking 
 04. No authoritative information about the reliability of a site 
 05. Little support for adapting to the background of specific users 

 4. One Interesting Approach 
 01. The number of web servers was estimated by sampling and testing random IP address numbers and determining the fraction of such tests that successfully located a web server 
 02. The estimate of the average number of pages per server was obtained by crawling a sample of the servers identified in the first experiment 
 03. Lawrence, S. and Giles, C. L. (1999). Accessibility of information on the web. Nature, 400(6740): 107–109. 

 4. Applications of web mining
 a. E-commerce (Infrastructure) 
 b. Generate user profiles -> improving customization and provide users with pages, advertisements of interest 
 c. Targeted advertising -> Ads are a major source of revenue for Web portals (e.g., Yahoo, Lycos) and E-commerce sites. 
Internet advertising is probably the “hottest” web mining application today 
 d. Fraud -> Maintain a signature for each user based on buying patterns on the Web (e.g., amount spent, categories of items bought). If buying pattern changes significantly, then signal fraud  
 e. Network Management 
 f. Performance management -> Annual bandwidth demand is increasing ten-fold on average, annual bandwidth supply is rising only by a factor of three. Result is frequent congestion. During a major event (World cup), an overwhelming number of user requests can result in millions of redundant copies of data flowing back and forth across the world 
 g. Fault management -> analyze alarm and traffic data to carry out root cause analysis of faults 
 h. Information retrieval (Search) on the Web 
i. Automated generation of topic hierarchies 
j. Web knowledge bases 

 5. Why is Web Information Retrieval Important? 
 a. According to most predictions, the majority of human information will be available on the Web in ten years
 b. Effective information retrieval can aid in 
 c. Research: Find all papers about web mining 
 d. Health/Medicine: What could be the reason for symptoms of “yellow eyes”, high fever and frequent vomiting?
 f. Travel: Find information on the tropical island of St. Lucia
 g. Business: Find companies that manufacture digital signal processors
 h. Entertainment: Find all movies starring Marilyn Monroe during the years 1960 and 1970
 i. Arts: Find all short stories written by Jhumpa Lahiri 

 6. Why is Web Information Retrieval Difficult? 
 a. The Abundance Problem (99% of information of no interest to 99% of people)
 b. Hundreds of irrelevant documents returned in response to a search query
 c. Limited Coverage of the Web (Internet sources hidden behind search interfaces)
 d. Largest crawlers cover less than 18% of Web pages
 e. The Web is extremely dynamic 
f. Lots of pages added, removed and changed every day 
g. Very high dimensionality (thousands of dimensions) 
h. Limited query interface based on keyword-oriented search 
i. Limited customization to individual users 

 6. Web Mining Taxonomy 
 01. Web content mining: focuses on techniques for assisting a user in finding documents that meet a certain criterion (text mining) 
 02. Web structure mining: aims at developing techniques to take advantage of the collective judgement of web page quality which is available in the form of hyperlinks 
 03. Web usage mining: focuses on techniques to study the user behaviour when navigating the web (also known as Web log mining and clickstream analysis) 

 7. Web Content Mining 

 01. Can be thought of as extending the work performed by basic search engines 
 02. Search engines have crawlers to search the web and gather information, indexing techniques to store the information, and query processing support to provide information to the users 
 03. Web Content Mining is: the process of extracting knowledge from web contents 

 8. Structuring Textual Information
 01. Many methods designed to analyze structured data
 02. If we can represent documents by a set of attributes we will be able to use existing data mining methods
 03. How to represent a document?
 04. Vector-based representation (referred to as “bag of words”, as it is invariant to permutations); see the sketch after this list
 05. Use statistics to add a numerical dimension to unstructured text.
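
As a small illustration of this vector-based idea (the sample sentence is invented), each document can be turned into a term-frequency map that is invariant to word order, so standard data mining methods can then operate on it:

// Hedged "bag of words" sketch: count term frequencies in a document.
function bagOfWords(text) {
    var counts = {};
    var words = text.toLowerCase().match(/[a-z]+/g) || [];
    for (var i = 0; i < words.length; i++) {
        counts[words[i]] = (counts[words[i]] || 0) + 1;
    }
    return counts;
}

console.log(bagOfWords("Web mining uses data mining techniques on the Web"));
// { web: 2, mining: 2, uses: 1, data: 1, techniques: 1, on: 1, the: 1 }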

 9. Text Mining 
 01. Document classification
 02. Document clustering
 03. Key-word based association rules

 10. Web Search 
1.Domain-specific search engines 
a. www.buildingonline.com 
b. www.lawcrawler.com
c. www.drkoop.com (medical) 
d. Meta-searching 
e. Connects to multiple search engines and combines the search results
f. www.metacrawler.com
g. www.dogpile.com
h. www.37.com

 2. Post-retrieval analysis and visualization
 a. www.vivisimo.com 
b. www.tumba.pt 
c. www.kartoo.com 
d. Natural language processing 
e. www.askjeeves.com 
f. Search Agents 
g. Instead of storing a search index, search agents can perform realtime searches on the Web.
h. Fresher data, slower response time and lower coverage. 

11. Web Structure Mining First generation of search engines
 01. Early days: keyword based searches
      a. Keywords: “web mining”
      b. Retrieves documents with “web” and “mining”
02. Later on: cope with
a. synonymy problem
b. polysemy problem
c. stop words
03. Common characteristic: Only information on the pages is used.

12. Modern search engines

 01. Link structure is very important
      a. Adding a link: deliberate act
      b. Harder to fool systems using in-links
      c. Link is a “quality mark”
 02. Modern search engines use link structure as an important source of information (a small ranking sketch follows this section).
 1. The Web Structure
      a. If the web is treated as an undirected graph, 90% of the pages form a single connected component
      b. If the web is treated as a directed graph, four distinct components are identified, all four of similar size.
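
The idea that in-links act as a quality mark can be illustrated with a tiny PageRank-style iteration. This is a hedged, minimal sketch: the three-page link graph and the damping factor are assumptions for illustration, not how any particular engine works.

// Minimal PageRank-style sketch over an assumed toy link graph.
function pageRank(links, iterations, d) {
    var pages = Object.keys(links);
    var n = pages.length;
    var rank = {};
    pages.forEach(function (p) { rank[p] = 1 / n; });
    for (var it = 0; it < iterations; it++) {
        var next = {};
        pages.forEach(function (p) { next[p] = (1 - d) / n; });
        pages.forEach(function (p) {
            var outs = links[p];
            outs.forEach(function (q) { next[q] += d * rank[p] / outs.length; });
        });
        rank = next;
    }
    return rank;
}

var links = { a: ["b", "c"], b: ["c"], c: ["a"] };
console.log(pageRank(links, 20, 0.85)); // "c", with the most in-link weight, scores highest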

 13. Some statistics
 01. A connecting path exists between only about 25% of page pairs, BUT
 02. If there is a path:
        a. Directed: average length < 17
        b. Undirected: smaller average length
 03. It's a small world -> between two people only a chain of length 6!
        a. High number of relatively small cliques
b. Small diameter 

04. Internet (SCC) is a small world graph.
14. Applications Web mining is an important tool to gather knowledge of the behaviour of a Website's visitors and thereby to allow for appropriate adjustments and decisions with respect to the Website's actual users and traffic patterns. Along with a description of the processes involved in Web mining, [Srivastava, 1999] states that Website Modification, System Improvement, Web Personalization and Business Intelligence are four major application areas for Web mining. These are briefly described in the following sections.

15. Website Modification The content and structure of the Website is important to the user experience/impression of the site and the site’s usability. The problem is that different types of users have different preferences, background, knowledge etc. making it difficult (if not impossible) to find a design that is optimal for all users. Web usage mining can then be used to detect which types of users are accessing the website, and their behaviour, knowledge which can then be used to manually design/re-design the website, or to automatically change the structure and content based on the profile of the user visiting it. Adaptive Websites are described in more detail in [Perkowitz & Etzioni. 1998].

16. System Improvement The performance and service of Websites can be improved using knowledge of the Web traffic in order to predict the navigation path of the current user. This may be used, e.g., for caching, load balancing or data distribution to improve performance. The path prediction can also be used to detect fraud, break-ins, intrusion etc. [Srivastava, 1999].

17. Web Personalization Web Personalization is an attractive application area for Web based companies, allowing for recommendations, marketing campaigns etc. to be specifically customized for different categories of users, and more importantly to do this in real-time, automatically, as the user accesses the Website. For example, [Mobasher et al. 1999] and [Yan et al. 1996] use association rules and clustering for grouping users and discovering the type of user currently accessing the Website (based on the user's path through the Website), in real-time, to dynamically adapt hyperlinks and content of the Website.

18. Business Intelligence For Web based companies Web mining is a powerful tool to collect business intelligence to get competitive advantages. Patterns of the customers’ activities on the Website can be used as important knowledge in the decision-making process, e.g. predicting customers’ future behaviour, recruiting new customers and developing new products are beneficial choices. There are many companies providing (among other things) services in the field of Web Mining and Web traffic analysis for extracting business intelligence, e.g.[BizInetl, 2011] and [WebTrends, 2011].

 19. Summary

01. Web is huge and dynamic
02. Web mining makes use of data mining techniques to automatically discover and extract information from Web documents/services 
03. Web content mining
04. Web structure mining
05. Web usage mining
06. Semantic web: "The Semantic Web is an extension of the current web in which information is given well-defined meaning, better enabling computers and people to work in cooperation." -- Tim Berners-Lee, James Hendler, Ora Lassila.


Introduction to Search Engine Optimization



01.What is SEO? 
      Search engine optimization (SEO) refers to techniques that help your website rank higher in organic (or “natural”) search results, thus making your website more visible to people who are looking for your product or service via search engines. SEO is part of the broader topic of Search Engine Marketing (SEM), a term used to describe all marketing strategies for search. SEM entails both organic and paid search. With paid search, you can pay to list your website on a search engine so that your website shows up when someone types in a specific keyword or phrase. Organic and paid listings both appear on the search engine, but they are displayed in different locations on the page. So, why is it important for your business' website to be listed on search engines? On Google alone, there are over 694,000 searches conducted every second.
      Think about that. Every second that your website is not indexed on Google, you are potentially missing out on hundreds, if not thousands, of opportunities for someone to visit your website, read your content, and potentially buy your product or service. Practicing SEO basics, as well as more advanced techniques after those, can drastically improve your website's ability to rank in the search engines and get found by your potential customers. What about paid search? Yes, you can pay to have your website listed on the search engines.
      However, running paid search campaigns can be quite costly if you don't know what you're doing. Not to mention, about 88% of search engine users never click on paid search ads anyway. Because the sole purpose of a search engine is to provide you with relevant and useful information, it is in everyone's best interest (for the search engine, the searcher, and you) to ensure that your website is listed in the organic search listings. In fact, it is probably best to stay away from paid search altogether until you feel you have a firm grasp on SEO and what it takes to rank organically.

02.How Search Engines Work?
       Search engines have one objective – to provide you with the most relevant results possible in relation to your search query.
      If the search engine is successful in providing you with information that meets your needs, then you are a happy searcher. And happy searchers are more likely to come back to the same search engine time and time again because they are getting the results they need. In order for a search engine to be able to display results when a user types in a query, they need to have an archive of available information to choose from. Every search engine has proprietary methods for gathering and prioritizing website content. Regardless of the specific tactics or methods used, this process is called indexing. Search engines actually attempt to scan the entire online universe and index all the information so they can show it to you when you enter a search query.
      How do they do it? Every search engine has what are referred to as bots, or crawlers, that constantly scan the web, indexing websites for content and following links on each webpage to other webpages. If your website has not been indexed, it is impossible for your website to appear in the search results. Unless you are running a shady online business or trying to cheat your way to the top of the search engine results page (SERP), chances are your website has already been indexed. So, big search engines like Google, Bing, and Yahoo are constantly indexing hundreds of millions, if not billions, of webpages. How do they know what to show on the SERP when you enter a search query? The search engines consider two main areas when determining what your website is about and how to prioritize it.
      1. Content on your website: When indexing pages, the search engine bots scan each page of your website, looking for clues about what topics your website covers and scanning your website's back-end code for certain tags, descriptions, and instructions.
     
      2. Who's linking to you: As the search engine bots scan webpages for indexing, they also look for links from other websites. The more inbound links a website has, the more influence or authority it has. Essentially, every inbound link counts as a vote for that website's content. Also, each inbound link holds different weight. For instance, a link from a highly authoritative website like The New York Times (nytimes.com) will give a website a bigger boost than a link from a small blog site. This boost is sometimes referred to as link juice. When a search query is entered, the search engine looks in its index for the most relevant information and displays the results on the SERP. The results are then listed in order of most relevant and authoritative. If you conduct the same search on different search engines, chances are you will see different results on the SERP. This is because each search engine uses a proprietary algorithm that considers multiple factors in order to determine what results to show in the SERP when a search query is entered. (A toy sketch of the indexing idea follows this list.)
     
      3.A few factors that a search engine algorithm may consider when deciding what information to show in the SERP include: a. Geographic location of the searcher b. Historical performance of a listing (clicks, bounce rates, etc.) c. Link quality (reciprocal vs. one-way) d. Webpage content (keywords, tags, pictures) e. Back end code or HTML of webpage f. Link type (social media sharing, link from media outlet, blog, etc.) With a 200B market cap, Google dominates the search engine market. Google became the leader by fundamentally revolutionizing the way search engines work and giving searchers better results with their advanced algorithm. With 64% market share, according to Compete, Inc., Google is still viewed as the primary innovator and master in the space. Before the days of Google (circa 1997), search engines relied solely on indexing web page content and considering factors like keyword density in order to determine what results to put at the top of the SERP. This approach gave way to what are referred to as black-hat SEO tactics, as website engineers began intentionally stuffing their webpages with keywords so they would rank at the top of the search engines, even if their webpages were completely irrelevant to the search result.
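
As a toy sketch of the indexing idea described above (the URLs and page text are invented placeholders), crawled pages can be stored in an inverted index so that a query term is looked up directly instead of rescanning every page:

// Hedged sketch: build a tiny inverted index from crawled page text.
var pages = {
    "example.com/seo": "seo helps your website rank in organic search results",
    "example.com/ppc": "paid search ads appear above organic results"
};

var invertedIndex = {};
Object.keys(pages).forEach(function (url) {
    pages[url].split(/\s+/).forEach(function (word) {
        if (!invertedIndex[word]) invertedIndex[word] = [];
        if (invertedIndex[word].indexOf(url) === -1) invertedIndex[word].push(url);
    });
});

console.log(invertedIndex["organic"]); // both URLs contain the query term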
     
       4. What it Takes to Rank It is not difficult to get your website to index and even rank on the search engines. However, getting your website to rank for specific keywords can be tricky. There are essentially 3 elements that a search engine considers when determining where to list a website on the SERP: rank, authority, and relevance. Rank Rank is the position that your website physically falls in on the SERP when a specific search query is entered. If you are the first website in the organic section of the SERP (don't be confused by the paid ads at the very top), then your rank is 1. If your website is in the second position, your rank is 2, and so on.
      As discussed previously in How Search Engines Work, your rank is an indicator of how relevant and authoritative your website is in the eyes of the search engine, as it relates to the search query entered. Tracking how your website ranks for a specific keyword over time is a good way to determine if your SEO techniques are having an impact. However, since there are so many other factors beyond your control when it comes to ranking, do not obsess over it. If your website jumps 1-5 spots from time to time, that's to be expected. It's when you jump 10, 20, 30 spots up in the rankings that it makes sense to pat yourself on the back.

       5. Authority As previously discussed in the How Search Engines Work section, search engines determine how authoritative and credible a website's content is by calculating how many inbound links (links from other websites) it has. However, the number of inbound links does not necessarily correlate with higher rankings. The search engines also look at how authoritative the websites that link to you are, what anchor text is used to link to your website, and other factors such as the age of your domain. You can track over time how authoritative your website is by monitoring a few different metrics. There are a variety of tools to help you keep track. HubSpot offers a free tool called Website Grader that will show you how many domains are linking to your website, and also provide your website's MozRank. MozRank is SEOmoz's general, logarithmically scaled 10-point measure of global link authority or popularity. It is very similar in purpose to the measures of link importance used by the search engines (e.g., Google's PageRank).
       6. Relevance Relevance is one of the most critical factors of SEO. The search engines are not only looking to see that you are using certain keywords, but they are also looking for clues to determine how relevant your content is to a specific search query. Besides actual text on your webpages, the search engines will review your website's structure, use of keywords in your URLs, page formatting (such as bolded text), and what keywords are in the headline of the webpage versus those in the body text. While there is no way to track how relevant your website is, there are some SEO basics you can practice to cover your bases and make sure you are giving the search engines every possible opportunity to consider your website.
      We'll get to that in just a bit. Search engines are extremely complex. Bottom line: the search engines are trying to think like human beings. It is very easy to get caught up in modifying your website's content just so you rank on the search engines. When in doubt, always err on the side of providing relevant and coherent content that your website's audience (your prospects) can digest. If you find yourself doing something solely for the search engines, you should take a moment to ask yourself why.
       7. Content is King We've all heard it - when it comes to SEO, content is king. Without rich content, you will find it difficult to rank for specific keywords and drive traffic to your website. Additionally, if your content does not provide value or engage users, you will be far less likely to drive leads and customers. It is impossible to predict how people will search for content and exactly what keywords they are going to use. The only way to combat this is to generate content and lots of it. The more content and webpages you publish, the more chances you have at ranking on the search engines. Lottery tickets are a good analogy here. The more lottery tickets you have, the higher the odds are that you will win.
      Imagine that every webpage you create is a lottery ticket. The more webpages you have, the higher your chances are of ranking in the search engines. As you already know, the search engines are smart. If you create multiple webpages about the same exact topic, you are wasting your time. You need to create lots of content that covers lots of topics. There are multiple ways you can use content to expand your online presence and increase your chances of ranking without being repetitive. Here are a few examples: Homepage: Use your homepage to cover your overall value proposition and high-level messaging.
      If there was ever a place to optimize for more generic keywords, it is your homepage. Product/Service Pages: If you offer products and/or services, create a unique webpage for each one of them. Resource Center: Provide a webpage that offers links to other places on your website that cover education, advice, and tips.
     
      Blog: Blogging is an incredible way to stay current and fresh while making it easy to generate tons of content. Blogging on a regular basis (once per week is ideal) can have a dramatic impact on SEO because every blog post is a new webpage. While conducting SEO research, you may come across articles that discuss being mindful of keyword density (how often you mention a keyword on a page). Although following an approach like this may seem technically sound, it is not recommended. Remember: do not write content for the search engines. Write content for your audience and everything else will follow.
Make sure each webpage has a clear objective and remains focused on one topic, and you will do just fine.
How to Approach Your SEO Strategy
When developing an SEO strategy, it is best to split your initiatives into two buckets: on-page SEO and off-page SEO. On-page SEO covers everything you can control on each specific webpage and across your website to make it easy for the search engines to find, index, and understand the topical nature of your content. Off-page SEO covers all aspects of SEO that happen off your website to garner quality inbound links. Let's dive into on-page SEO first, and then we'll tackle off-page SEO in the next section.
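As a small illustration of those on-page signals, the Python sketch below checks whether a target keyword appears in the places the search engines look. The URL, page content and keyword are invented examples, and real crawlers are far more sophisticated:

# Toy on-page SEO check: does the target keyword appear where search engines
# look for relevance clues? Illustrative only.
import re

def on_page_signals(url: str, title: str, h1: str, body: str, keyword: str) -> dict:
    kw = keyword.lower()
    return {
        "in_url": kw.replace(" ", "-") in url.lower(),
        "in_title": kw in title.lower(),
        "in_headline": kw in h1.lower(),
        "body_mentions": len(re.findall(re.escape(kw), body.lower())),
    }

print(on_page_signals(
    url="https://example.com/inbound-marketing-guide",
    title="Inbound Marketing Guide | Example Co.",
    h1="A Beginner's Guide to Inbound Marketing",
    body="Inbound marketing attracts prospects with helpful, relevant content...",
    keyword="inbound marketing",
))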
     
Thanks for reading my article. If you have any queries, comment below.

Web Hosting

An Introduction: What is Web Hosting?



1. What is web hosting?
Web hosting simply means internet hosting that enables businesses and individuals to make their online presence, in the form of a website, accessible to the public via the internet. A website needs two things to be hosted, or to become accessible to everyone: Web space: Website files, HTML code, images and everything else are stored in this space. The heavier your website, the more space you require to store its content.
Bandwidth: In the web hosting industry, bandwidth refers to the amount of data that can be transferred to and from a server or a website.
It is the allotted internet bandwidth that makes a website accessible to everyone online. The more bandwidth you have, the better and faster your network, connection and system will be. The bandwidth requirement is directly proportional to the number of visitors who visit a site: the more visitors, the more bandwidth is required. The requirement for space and bandwidth is fulfilled by a web hosting provider. However, in addition to these two, the hosting provider also maintains the server, ensures website uptime and provides data security.
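As a rough guide, you can estimate how much monthly data transfer a site needs from its average page size and expected traffic. The figures in this small Python sketch are made-up examples:

# Rough estimate of monthly data transfer from page size and traffic.
avg_page_size_mb = 0.3        # about 300 KB per page view (example figure)
visitors_per_month = 20_000   # example traffic
pages_per_visit = 4           # example behaviour

monthly_transfer_gb = avg_page_size_mb * visitors_per_month * pages_per_visit / 1024
print(f"Estimated data transfer: {monthly_transfer_gb:.1f} GB per month")
# Roughly 23 GB per month with these numbers; more visitors or heavier pages
# (images, video) push the figure up proportionally.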

2. Types of hosting: Depending upon your web space and bandwidth requirements, you can purchase the required hosting from three basic types of hosting available: Shared Hosting: In this hosting, multiple accounts are hosted on the same server and the resources are shared among them. It is best for small businesses whose websites have low to moderate traffic and CPU needs. It is like an apartment building, where you pay less but share the space with many other people. You are also affected if your neighbor decides to party, as he and his visitors may create a disturbance or use up your parking space. Similarly, in shared hosting, if another user consumes more bandwidth because their site gets more traffic, your site may go down.
3. Dedicated Server: In this hosting, the owner has the complete server and all the resources exclusively to himself. However, the convenience and solitude make it the costliest hosting option. It is ideal for large enterprises and organizations whose websites have heavy traffic and CPU needs. It is like having your own home, where you can live according to your convenience, but you also have to bear the purchase and maintenance costs alone. Similarly, in dedicated hosting, you will pay more than for any other type of hosting to have the server, bandwidth and resources all to yourself.
4. VPS Hosting: In this hosting, virtualization technology is used to partition a computer virtually into multiple servers. There is no physical partition, but because of the virtual (software) partition, each user gets much more privacy and security than in shared hosting.
It is like living in a comfortable condo, where you get your own privacy, fewer neighbors and better space, but you still share your walls and plot with others. The price for the added space is obviously more than apartment housing, but is not as exorbitant as the cost of having your own home. Similarly, in VPS hosting you are less affected by busy neighboring websites, and the cost is higher than shared hosting but lower than dedicated hosting.
VPS hosting also gives you the freedom to install whatever you want on the server. You can try a new programming language or deploy a custom Apache module. As you have root access, you can make any changes using the command line or remote desktop. VPS hosting offers many more advantages than shared hosting and almost the same advantages as dedicated hosting, but in terms of price it is less expensive than dedicated hosting and more expensive than shared hosting.

How do I choose the best hosting company?
When choosing a hosting company, there are certain ingredients that can make the difference between online success and an internet disaster. Do you want to be able to contact your hosting company at 10pm if your site and email develop problems?

Do you know how secure and stable the provider's hosting setup is? Do you want to deal with someone who speaks plain English instead of technical jargon? There are a number of small hosting businesses set up in garages and basements or with limited staff resources. They may be cheaper, but the smaller the company, the fewer resources they have available to provide adequate customer support or servicing when you need it. It is not unknown for websites to go down, only for the owner to discover that the person responsible for fixing the server is on holiday or otherwise unavailable. Every time your website is down, your business loses money, so compromises on hosting prices can cost you in other ways. The most reliable hosting companies provide dedicated customer support and technical services, capable of dealing with your issues day and night.
If your website goes offline, you don't want to wait a couple of days before the problem is even discovered; you want to be sure your provider will identify and correct the issue before you even know about it. Also, although it is possible to host your website anywhere in the world, choosing a local hosting company can have greater advantages. Dealing with a hosting company in the United States can be frustrating.
Long distance calls are expensive and email support can often be slow and unhelpful outside of US business hours. If your website is suffering costly delays during Australian business hours, you don’t want to wait for New York to wake up before it can be fixed. A reputable hosting provider should have enough back-up safeguards to ensure your website never goes down (well, at least 99.99% of the time).
The last thing you want is the server with your website on it floating through a flooded building with no appropriate back-up stored high-and-dry elsewhere. The same goes for the connections to the web. If the local road works chop through a Telstra cable, wouldn’t you feel better knowing your server was also linked to the net through at least one or two other connections, providing uninterrupted service? Make sure you know how your provider can guarantee your online store will remain open for business 24x7.
Netregistry houses servers in the largest data centre in the southern hemisphere, Global Switch in Pyrmont, as well as the E3 data centre in Alexandria, providing reliable, secure and, above all, local hosting.
How much uptime is good enough?
The last thing your website needs is downtime, which is why many hosting providers offer uptime guarantees. But how much is good enough? The difference between 99.5% and 99.9% uptime might not seem like much, but it equates to roughly 35 hours of your website being unavailable to customers each year. That is about a day and a half of forced closure with no sales and no money coming in. A 0.4% difference can therefore cost your business a lot more than you may think.
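To put those percentages in perspective, here is a small Python sketch that converts an uptime guarantee into hours of downtime per year:

# Convert an uptime guarantee into hours of downtime per year.
HOURS_PER_YEAR = 365 * 24  # 8,760 hours

def downtime_hours(uptime_percent: float) -> float:
    return HOURS_PER_YEAR * (1 - uptime_percent / 100)

for guarantee in (99.5, 99.9, 99.99):
    print(f"{guarantee}% uptime allows {downtime_hours(guarantee):.1f} hours of downtime per year")
# 99.5% allows 43.8 hours, 99.9% allows 8.8 hours (a difference of about 35 hours),
# and 99.99% allows under an hour.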
What do I need to consider when choosing a hosting package?
There are a number of factors to consider when deciding on the best value hosting solution for your website. It is important to not only understand your current goals, but also to have an idea of how the website is likely to evolve.
In choosing a hosting plan, there are a few key considerations: data storage, data transfer, bandwidth, databases and scripting technology. Most hosting accounts come equipped with features such as email; even so, some providers consider email an added extra for a fee. Check for the features you need beforehand so you have a clear idea of the total cost of the package.
How much data storage is necessary?
Most hosting plans contain more than enough data storage for the majority of websites.
But certain files take up more space than others. Lots of image, video or audio files can chew up storage space quickly. It is possible to reduce the size of many large files, so it is worth working with someone with the technical ability to compress the amount of data without compromising quality. If your website is likely to grow over time, this needs to be accounted for.
An extreme example would be a site like YouTube, needing to find storage for thousands of large video files every day. You will probably never be faced with that kind of growth, but even a few large files added regularly over time can soon eat up your storage space. Alternatively, simple text pages with only a few small images take relatively little storage space. You may be able to add hundreds of these pages before filling the space taken by a handful of audio or video files.
Where in the world is your web server?
Do you know where your hosting company is storing your data? Most Australian hosting is actually stored offshore, although you wouldn't know it. Storing your website overseas can also make it slower and less responsive.
Netregistry is 100% Australian owned and run, with the entire hosting infrastructure located and maintained in Sydney.
What is bandwidth?
Bandwidth is the 'pipe' that connects your website to the internet. It is a measure of the amount of digital information that can be accessed within a given time period.
For example, if your main webpage is 300 kilobytes in size, then every time a web browser accesses it, 300kb of data travels down the ‘pipe’. How fast this data is transferred depends on how many people are accessing data down the pipe at the same time and how large your pipe is. Larger files, such as audio and video, eat up bandwidth just as they eat up data storage.
Putting a lot of large files on your website can clog the flow of data through the pipe and make your website much slower to load in a visitor's browser if a lot of people access that content at once. Even with small files, your bandwidth can still become bottlenecked. If your data connection can cope with a certain amount of data transfer per second, but the number of people trying to access your website exceeds this, it can also cause a traffic jam. With the data not getting through fast enough, website visitors can receive 'timed out' error messages instead of your wonderful webpage. If too much traffic tries to access data through this connection at the same time, it is possible to crash the server, putting your website offline until the problem can be fixed. This can make some websites a victim of their own success, becoming inaccessible just when they are at their most popular.
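To picture the 'traffic jam' effect, compare the data visitors request per second with what the connection can deliver. The numbers in this short Python sketch are invented for illustration:

# Will the connection keep up with a traffic spike? Illustrative numbers only.
page_size_mbit = 0.3 * 8          # a 300 KB page is roughly 2.4 megabits
peak_requests_per_second = 50     # example spike in visitors
connection_capacity_mbit = 100    # example server uplink of 100 Mbit/s

demand_mbit = page_size_mbit * peak_requests_per_second
print(f"Peak demand: {demand_mbit:.0f} Mbit/s, capacity: {connection_capacity_mbit} Mbit/s")
if demand_mbit > connection_capacity_mbit:
    print("The pipe is saturated; visitors will see slow loads or timeouts.")
else:
    print("Within capacity; pages should keep loading normally.")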
If you are expecting a large spike in traffic or an increase in large files, it is worth talking to your web host about bandwidth solutions. This could mean a transfer to a different plan or short-term strategies such as website mirrors (a duplicate of the website on a separate hosting server).
What is clustered hosting?
Hosting usually requires websites to share server space on the one unit, but imagine how upset you would be if the actions of one website impacted the performance of your business because it shares the same server.
If a website receives too much traffic or overloads (crashes) the server through increased activity, it affects everything stored in the same place. Clustered hosting spreads the load across multiple machines so that no one website can affect any other. This allows for a far more secure and stable hosting environment with fewer risks. Always choose a hosting provider that offers clustered hosting to be sure that your website remains operating at its best.

Should I worry about data transfer limits?
Hosting accounts usually have a data transfer limit. This is the amount of data the hosting package can provide to the internet within a given time period (usually a month). This is similar to your home internet account, for example, where you pay so much for your connection with a fixed download limit. If you go over your monthly limit, depending on your provider, you may find your internet speed throttled to a crawl or be charged extra for every additional megabyte of data. Neither outcome is ideal for your website: it could become slow to load, or you could receive a larger bill to pay for the extra data transfer. Thankfully, you don't need to worry. All Netregistry hosting accounts have unlimited data transfer, no throttling and no additional data fees.
What is database technology?
More and more websites now allow visitors to interact with and manipulate information on the webpage. This may be by entering and registering their information, performing searches to be presented with a page of specific results, or entering comments into a blog or forum, to mention just three examples. These websites are called 'dynamic', and use database technology to store information in sections that can be reassembled into fresh webpages in answer to the visitor's request. Classic examples of dynamic websites are blogs, eBay or any site that allows a user to sign in and create a profile. Also, any website that uses a content management system or shopping cart works in the same way. If you plan to include any of these features on your website, you will need to check for database technology on your hosting server.
An example is Netregistry's Business Hosting, which provides the most commonly used database features for small business. The database is also bounded by storage limits, and it is worth knowing what these are before planning a complex website. If you don't plan to use any dynamic features, you may only need basic 'static' hosting, such as Netregistry's Economy Hosting package, saving you money. Of course, if at any time you decide to develop the features on your website, it is possible to upgrade your hosting plan to the correct configuration.
What is Scripting?
The most common language used to code web pages is called HyperText Markup Language (HTML).
This code tells a web browser how to display the webpage. But when creating dynamic webpages, some additional scripting languages are needed to tell the web browser where to access the information needed to construct the elements on the page. Because your hosting server will need to interpret these scripts to provide the correct responses from the database, you need to be sure it has been configured with the relevant languages.
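As a rough illustration of how a dynamic page works, the sketch below uses Python and SQLite purely as an example stack; the shop.db database and products table are hypothetical, and PHP, Perl or ASP pages follow the same pattern of pulling rows from a database and turning them into HTML:

# A minimal "dynamic" page: a script builds HTML from database rows.
# Python and SQLite are used only as an example; PHP, Perl or ASP work the same way.
import sqlite3

def render_product_page(db_path: str = "shop.db") -> str:
    conn = sqlite3.connect(db_path)
    rows = conn.execute("SELECT name, price FROM products ORDER BY name").fetchall()
    conn.close()
    items = "\n".join(f"<li>{name}: ${price:.2f}</li>" for name, price in rows)
    return f"<html><body><h1>Our products</h1><ul>\n{items}\n</ul></body></html>"

if __name__ == "__main__":
    print(render_product_page())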

Common scripting languages are PHP, Perl, ASP, etc. Unless you are building the website yourself, it is unlikely you will need to know anything about these languages. Your web designer will be able to tell you which particular scripts to look for when choosing a hosting package, but most hosting servers with database technology support the majority of commonly used scripts.
Where can I get more advice?
You may now understand some of the basics, but relating them to your specific situation can sometimes require experience and further knowledge. If you are still unsure how to choose the best hosting plan for your website, the Netregistry sales team is trained in determining your specific needs. The best hosting plan is one that doesn't require regular attention or additional monthly fees, but can cope with the daily demands of your website without complaint. By addressing the key principles, you can ask the right questions of your hosting provider and give your website the best platform to reach your audience.
Netregistry has been providing strong hosting advice and solutions to Australian businesses since 1997. With static, dynamic and e-commerce hosting solutions available, Netregistry can provide the reliability and features you need to form the foundations for your new online empire.

SSL CERTIFICATE

WHY YOU NEED AN SSL CERTIFICATE 


Introduction
Recent numbers from the U.S. Department of Commerce show that online retail is continuing its rapid growth. However, malicious phishing and pharming schemes and fear of inadequate online security cause online retailers to lose out on business, as potential customers balk at doing business online, worrying that sensitive data will be abused or compromised. For e-businesses, the key is to build trust: running a successful online business requires that your customers trust that your business effectively protects their sensitive information from intrusion and tampering. Installing an SSL Certificate from Starfield Technologies on your e-commerce Web site allows you to secure your online business and build customer confidence by securing all online transactions with up to 256-bit encryption.
                An SSL Certificate on your business’ Web site will ensure that sensitive data is kept safe from prying eyes. With a Starfield Technologies SSL Certificate, customers can trust your site. Before issuing a certificate, Starfield Technologies rigorously authenticates the requestor’s domain control and, in the case of High Assurance SSL Certificates, the identity and, if applicable, the business records of the certificate-requesting entity. The authentication process ensures that customers and business partners can rest assured that a Web site protected with a Starfield Technologies certificate can be trusted.
A Starfield Technologies SSL Certificate provides the security your business needs and the protection your customers deserve. With a Starfield Technologies SSL Certificate, customers will know that your site is secure.
Why You Need a Starfield Technologies SSL Certificate
In the rapidly expanding world of electronic commerce, security is paramount. Despite booming Internet sales, widespread consumer fear that Internet shopping is not secure still keeps millions of potential shoppers from buying online.
               Only if your customers trust that their credit card numbers and personal information will be kept safe from tampering can you run a successful online business. For online retailers, securing their shopping sites is paramount. If consumers perceive that their credit card information might be compromised online, they are unlikely to do their shopping on the Internet. A Starfield Technologies SSL Certificate provides an easy, cost-effective and secure means to protect customer information and build trust.
               An SSL Certificate enables Secure Sockets Layer (SSL) encryption of your business’ online transactions, allowing you to build an impenetrable fortress around your customers’ credit card information.
 

Starfield Technologies SSL Certificates offer industry-leading security and versatility: 

1. Fully validated
2. Up to 256-bit encryption
3. One-, two- or three-year validity (Turbo SSL Certificates valid up to 10 years)
4. 99% browser recognition
5. Stringent authentication
6. Around-the-clock customer support
A Starfield Technologies SSL Certificate helps you build an impenetrable fortress around your customers' credit card information.

What is an SSL Certificate? 

An SSL certificate is a digital certificate that authenticates the identity of a Web site to visiting browsers and encrypts information for the server via Secure Sockets Layer (SSL) technology. A certificate serves as an electronic “passport” that establishes an online entity’s credentials when doing business on the Web. When an Internet user attempts to send confidential information to a Web server, the user’s browser will access the server’s digital certificate and establish a secure connection.

Information contained in the certificate includes:
– The certificate holder's name (individual or company)*
– The certificate's serial number and expiration date
– A copy of the certificate holder's public key
– The digital signature of the certificate-issuing authority
To obtain an SSL certificate, one must generate and submit a Certificate Signing Request (CSR) to a trusted Certification Authority, such as Starfield Technologies, which will authenticate the requestor's identity, existence and domain registration ownership before issuing a certificate.
Public and Private Keys
When you create a CSR, the Web server software with which the request is being generated creates two unique cryptographic keys: a public key, which is used to encrypt messages to your (i.e., the certificate holder's) server and is contained in your certificate, and a private key, which is stored on your local computer and "decrypts" the secure messages so they can be read by your server.
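As an illustration of that key pair and the CSR, here is a minimal sketch using the Python cryptography package. The domain and organisation names are placeholders, and in practice your hosting control panel or web server software will usually generate the CSR for you:

# Sketch: generating a private key and a Certificate Signing Request (CSR)
# with the "cryptography" Python package. Domain and organisation are placeholders.
from cryptography import x509
from cryptography.hazmat.primitives import hashes, serialization
from cryptography.hazmat.primitives.asymmetric import rsa
from cryptography.x509.oid import NameOID

# 1. The private key stays on your server and is never sent to the CA.
key = rsa.generate_private_key(public_exponent=65537, key_size=2048)

# 2. The CSR embeds the matching public key plus the requestor's details.
csr = (
    x509.CertificateSigningRequestBuilder()
    .subject_name(x509.Name([
        x509.NameAttribute(NameOID.COMMON_NAME, "www.example.com"),
        x509.NameAttribute(NameOID.ORGANIZATION_NAME, "Example Pty Ltd"),
    ]))
    .sign(key, hashes.SHA256())
)

# 3. The PEM-encoded CSR is what you submit to the Certification Authority.
with open("server.key", "wb") as f:
    f.write(key.private_bytes(
        serialization.Encoding.PEM,
        serialization.PrivateFormat.TraditionalOpenSSL,
        serialization.NoEncryption(),
    ))
with open("server.csr", "wb") as f:
    f.write(csr.public_bytes(serialization.Encoding.PEM))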
In order to establish an encrypted link between your Web site and your customer's Web browser, your Web server will match your issued SSL certificate to your private key. Because only the Web server has access to its private key, only the server can decrypt SSL-encrypted data.
*High Assurance Certificates only. Turbo SSL Certificates only contain the domain name and no information on the individual or company that purchased the certificate.
Enabling Safe and Convenient Online Shopping
           A Starfield Technologies SSL Certificate secures safe, easy and convenient Internet shopping. Once an Internet user enters a secure area — by entering credit card information, e-mail address or other personal data, for example — the shopping site’s SSL certificate enables the browser and Web server to build a secure, encrypted connection. The SSL “handshake” process, which establishes the secure session, takes place discreetly behind the scenes, ensuring an uninterrupted shopping experience for the consumer.
           A “padlock” icon in the browser’s status bar and the “https://” prefix in the URL are the only visible indications of a secure session in progress. By contrast, if a user attempts to submit personal information to an unsecured Web site (i.e., a site that is not protected with a valid SSL certificate), the browser’s built-in security mechanism will trigger a warning to the user, reminding him/her that the site is not secure and that sensitive data might be intercepted by third parties. Faced with such a warning, most Internet users likely will look elsewhere to make a purchase.
Up to 256-Bit Encryption
Starfield Technologies SSL certificates support both industry-standard 128-bit (used by all banking infrastructures to safeguard sensitive data) and high-grade 256-bit SSL encryption to secure online transactions. The actual encryption strength on a secure connection using a digital certificate is determined by the level of encryption supported by the user's browser and the server that the Web site resides on. For example, the combination of a Firefox browser and an Apache 2.X Web server enables up to 256-bit AES encryption with Starfield Technologies certificates.
Encryption strength is measured in key length — the number of bits in the key. To decipher an SSL communication, one needs to generate the correct decoding key. Mathematically speaking, 2^n possible values exist for an n-bit key. Thus, 40-bit encryption involves 2^40 possible values. 128- and 256-bit keys involve a staggering 2^128 and 2^256 possible combinations, respectively, rendering the encrypted data de facto impervious to intrusion. Even with a brute-force attack (the process of systematically trying all possible combinations until the right one is found), cracking 128- or 256-bit encryption is computationally infeasible.
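A quick back-of-the-envelope calculation shows why brute force is hopeless; the attack rate assumed below (one trillion guesses per second) is deliberately generous:

# Key sizes versus brute force, assuming a very generous attacker testing
# one trillion keys per second.
GUESSES_PER_SECOND = 10 ** 12
SECONDS_PER_YEAR = 60 * 60 * 24 * 365

for bits in (40, 128, 256):
    keys = 2 ** bits                      # 2^n possible keys for an n-bit key
    years = keys / GUESSES_PER_SECOND / SECONDS_PER_YEAR
    print(f"{bits:>3}-bit key: {keys:.3e} possible keys, about {years:.3e} years to exhaust")
# A 40-bit key falls in about a second; 128- and 256-bit keys would take
# astronomically longer than the age of the universe.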
Stringent Authentication — A Matter of Trust
Before Starfield Technologies issues an SSL Certificate, the applicant's company or personal information undergoes a rigorous authentication procedure that serves to pre-empt online theft and to verify the domain control and, if applicable, the existence and identity of the requesting entity. Only through thorough validation of submitted data can the online customer rest assured that online businesses that use SSL certificates from Starfield Technologies are indeed to be trusted. SSL Certificates are only issued to entities whose domain control and, depending on certificate type, business credentials and contact information have been verified. Thus, a Starfield Technologies SSL certificate guarantees that the entity that owns the certificate is who it claims to be and has a legal right to use the domain from which it operates.

Starfield Technologies issues three types of SSL Certificates, each of which relies on authentication of a number of elements:
High Assurance Certificate — Corporate: Starfield Technologies will authenticate that:
1. The certificate is being issued to an organization that is currently registered with a government authority.
2. The requesting entity controls the domain in the request.
3. The requesting entity is associated with the organization named in the certificate.
High Assurance Certificate — Small Business/Sole Proprietor: Starfield Technologies will authenticate that:
1. The individual named in the certificate is the individual who requested the certificate.
2. The requesting individual controls the domain in the request.
Turbo SSL Certificate: Starfield Technologies will authenticate that:
1. The requesting entity controls the domain in the request.

Phishing and Pharming — How SSL Can Help
Phishing and, more recently, pharming pose constant threats to Internet users, whose sensitive information is under siege by crackers and other cyber crooks.
         
             An SSL certificate from Starfield Technologies can clip the wings of Internet criminals and help prevent Internet users from being victimized by phishing and pharming schemes when attempting to visit your Web site. Phishing schemes – attempts to steal and exploit sensitive personal information – typically try to trick victims into accessing fraudulent sites that pose as legitimate, trusted entities, such as online businesses and banks. Because perpetrators of such attacks will be using and registering domains that resemble those of the spoofed sites, Starfield Technologies, through its stringent fraud-prevention measures, will detect the schemes and deny certificate requests for suspicious domains.
More sophisticated than phishing, pharming revolves around the concept of hijacking an Internet Service Provider's (ISP) domain name server (DNS) entries. When a "pharmer" succeeds in such DNS "poisoning", every computer using that ISP for Internet access is directed to the wrong site when the user types in a URL (e.g., www.ebay.com). SSL certificate technology can help prevent pharming attacks as well. In essence, a "pharmer" simply will not be able to obtain an SSL certificate from Starfield Technologies, as he/she does not control the domain for which the certificate is requested.
By protecting your Web site with a Starfield Technologies SSL certificate, Internet users who attempt to access a site posing as yours will be instantly alerted that there is a problem with the supposedly secure connection:
1. No lock icon: Because CAs usually won’t issue a certificate to fraudulent phishing or pharming sites, such sites usually do not use SSL encryption. Internet users, therefore, are alerted by the absence of a padlock icon in their browser’s status bar.
2. Name mismatch error: A pharming site could try to use a certificate issued by a CA for a domain owned by the attacker, but the user’s browser will warn the user that the visited URL does not match the certificate presented by the fake Web server.
3. Untrusted CA: A pharming site might attempt to use a certificate issued by an untrusted CA. In this case, the user's browser will generate the following warning: "the security certificate was issued by a company you have not chosen to trust." The alert Internet user will instantly abandon his/her activities and transactions when presented with such warnings. Thus, a Starfield Technologies SSL certificate provides business owners and wary, savvy Internet users with an effective weapon against phishing, pharming and similar cyber swindles.
Establishing a Secure Connection — How SSL Works
An SSL-encrypted connection is established via the SSL "handshake" process, which transpires within seconds — transparently to the end user.

In essence, the SSL “handshake” works thus:
1. When accessing an SSL-secured Web site area, the visitor’s browser requests a secure session from the Web server.
2. The server responds by sending the visitor’s browser its server certificate.
3. The browser verifies that the server’s certificate is valid, is being used by the Web site for which it has been issued, and has been issued by a Certificate Authority that the browser trusts.
4. If the certificate is validated, the browser generates a one-time “session” key and encrypts it with the server’s public key.
5. The visitor’s browser sends the encrypted session key to the server so that both server and browser have a copy.
6. The server decrypts the session key using its private key.
7. The SSL "handshake" process is complete, and an SSL connection has been established. A padlock icon appears in the browser's status bar, indicating that a secure session is under way.
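To see the outcome of this handshake from the client side, here is a small sketch using Python's standard ssl module; the host name is just an example. Note that recent TLS versions negotiate the session key a little differently from the classic description above, but the certificate check and the resulting encrypted session are the same idea:

# Observe the result of an SSL/TLS handshake with Python's standard library.
# The host name is only an example.
import socket
import ssl

hostname = "www.example.com"
context = ssl.create_default_context()   # trusted CA bundle, certificate checks on

with socket.create_connection((hostname, 443)) as sock:
    with context.wrap_socket(sock, server_hostname=hostname) as tls:
        print("Protocol: ", tls.version())    # e.g. TLSv1.3
        print("Cipher:   ", tls.cipher())     # the negotiated cipher suite
        cert = tls.getpeercert()
        print("Issued to:", dict(pair[0] for pair in cert["subject"]))
        print("Expires:  ", cert["notAfter"])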
Conclusion — The Key to Online Security
Demand for reliable online security is increasing. Despite booming online sales, many consumers continue to believe that shopping online is less safe than doing so at old-fashioned brick-and-mortar stores.
          The key to establishing a successful online business is to build customer trust. Only when potential customers trust that their credit card information and personal data is safe with your business, will they consider making purchases on the Internet.
         
Thanks for reading my article. If you have any queries, comment below.

Artificial Intelligent-IV

Hello, so we move forward to learn something new about Artificial Intelligent S...