Pipelining is a technique in which instruction processing is divided into stages, and each stage performs one operation on the data passing through it. With n stages, up to n operations can be in progress at once, one per stage. Pipelining is adopted to increase the throughput of the processor: once the pipeline is full, one instruction can complete every cycle.
The .NET Framework provides a run-time environment called the Common Language Runtime, which manages the execution of code and provides services that make the development process easier. Compilers and tools expose the runtime's functionality and enable you to write code that benefits from this managed execution environment. Code that you develop with a language compiler that targets the runtime is called managed code; it benefits from features such as cross-language integration, cross-language exception handling, enhanced security, versioning and deployment support, a simplified model for component interaction, and debugging and profiling services.
Generally, caching is used to improve the performance of a site. There are two places where we can use caching:
1. At the page level (output caching).
2. At the data level, to save database hits.
Virtual memory is used to extend the capability of physical memory; it is simulated by the hard drive. When all the RAM is being used, the computer swaps data to the hard drive and back to give the impression that there is more memory.
Caching is often considered a performance-enhancement tool rather than a way to store application data. If you spend server resources accessing the same data repeatedly, use caching instead. Caching data can bring huge performance benefits, so whenever you find that you need to frequently access data that doesn't often change, cache it in the cache object and your application's performance will improve.
In simple terms, interrupts come from hardware devices to signal the CPU that an event needs attention (for example, that a device has completed an operation). Servicing an interrupt is one of the ways a mode switch from user mode to kernel mode occurs (system calls and exceptions are the others).
MESI is a cache coherency protocol used in multi-processor systems to indicate the state of each line in a particular processor's cache. It stands for Modified, Exclusive, Shared, and Invalid.
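The core of the protocol can be sketched as a state-transition table. This is a simplified teaching model under stated assumptions (no bus transactions or data responses are modeled, and a local read on an Invalid line is assumed to find another sharer, hence S rather than E):

```python
# Simplified MESI transition table: (current_state, event) -> next_state.
# "local_*" are accesses by this core; "remote_*" are snooped events from
# another core touching the same cache line.
MESI = {
    ("I", "local_read"):   "S",  # line loaded; another copy assumed to exist
    ("I", "local_write"):  "M",  # read-for-ownership, then modify
    ("S", "local_write"):  "M",  # invalidate other copies, then modify
    ("S", "remote_write"): "I",  # another core modified the line
    ("E", "local_write"):  "M",  # exclusive copy, silent upgrade
    ("E", "remote_read"):  "S",  # another core now shares the line
    ("M", "remote_read"):  "S",  # write back, then share
    ("M", "remote_write"): "I",  # write back, then invalidate
}


def next_state(state, event):
    """Return the next MESI state; unchanged if no transition applies."""
    return MESI.get((state, event), state)
```

For example, a write to a Shared line must first invalidate the other cores' copies before the line becomes Modified.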
A pipeline is a process in which a business object goes through several stages asynchronously: one stage picks the object up, processes it, and drops it for the next stage to pick up. The hazard is that a different thread of the same process may pick up the same business object, leading to a malfunction. This can be handled by status handling or scan delays.
Level 1 (L1) cache is internal to the processor chip; L2 cache has traditionally been external, although modern processors integrate it on-chip as well. L1 cache is faster (and smaller) than L2 cache.
Memory management plays a crucial role in every operating system. It covers several areas, such as: 1. Main (storage) memory management. 2. I/O memory management, etc.
12. A number or character entered through the keyboard gets converted to its equivalent ASCII code and stored in RAM in binary form. What is the exact procedure in hardware that converts the ASCII value to binary?
Caching can be done in many ways, such as page caching, output caching, etc. With ASP.NET we can achieve these efficiently; caching is handled by the runtime, which has a direct impact on the caching feature in .NET technology.
In a primary storage device the storage capacity is limited and the memory is volatile. In a secondary storage device the storage capacity is larger and the memory is nonvolatile. Primary devices: RAM / ROM. Secondary devices: floppy disk / hard disk.
DMA is about hardware architecture: DMA stands for Direct Memory Access -- transferring the contents of memory to an I/O device (or back) without using the processor.
The Record Management System (RMS) is a simple record-oriented database that allows a MIDlet to persistently store information and retrieve it later. Different MIDlets can also use the RMS to share data.
A hard disk is a secondary storage device, which holds data in bulk on the magnetic medium of the disk. Hard disks have hard platters that hold the magnetic medium; the magnetic medium can be easily erased and rewritten. A typical desktop machine will have a hard disk with a capacity of between 10 and 40 gigabytes. Data is stored on the disk in the form of files.
A race condition can be a severe problem, at times crashing the server or system. Note, though, that the scenario often described in this context -- process A waits for a resource held by process B, while B waits for a resource held by A, forming a cyclic chain in which no resource is ever released and the system cannot leave the blocked state -- is more precisely a deadlock. A race condition, by contrast, is a timing-dependent bug.
A race condition is a bug in your application that occurs when the result depends on which of two or more threads reaches a shared block of code first. In this case, the application's output can change each time it is executed!
As an example, assume we have a shared integer object called x and two threads, 1 and 2. Thread 1 attempts to increment x by one, but its time slice ends during the increment. Thread 2's time slice then starts, and it attempts to increment the same x; it increments x successfully, and then its time slice ends. Thread 1 starts a new time slice and completes its increment, not knowing that the value of x has already changed. This is a race condition, and the output of such code is, of course, incorrect!
The above race condition problem can be solved by using the Interlocked class, with its Increment and Decrement methods, or by guarding the shared object with a lock.
Race conditions can generally be avoided by considering each line of code you write and asking yourself: what might happen if a thread finished before executing this line, or during it, and another thread overtook it?
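The scenario above can be demonstrated in any threaded language. Here is a minimal Python sketch (mirroring the idea of .NET's Interlocked/lock-based fix with `threading.Lock`): two threads increment a shared counter, and the lock makes the read-modify-write atomic.

```python
import threading

N = 100_000
counter = 0
lock = threading.Lock()


def safe_worker():
    """Increment the shared counter N times, holding the lock each time."""
    global counter
    for _ in range(N):
        with lock:       # only one thread may read-modify-write at a time
            counter += 1


threads = [threading.Thread(target=safe_worker) for _ in range(2)]
for t in threads:
    t.start()
for t in threads:
    t.join()

# With the lock, the final total is always 2 * N. Remove the `with lock:`
# line and the unsynchronized read-modify-write can lose updates, giving a
# smaller, run-to-run varying total -- exactly the race condition described.
```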
The settings made in the web.config file are applied to that particular web application only, whereas the settings in the machine.config file are applied to all ASP.NET applications on the machine.
In-process mode simply means using ASP.NET session state in a similar manner to classic ASP session state. That is, session state is managed in process, and if the process is recycled, state is lost. Given the new settings that ASP.NET provides, you might wonder why you would ever use this mode. The reasoning is quite simple: performance. The performance of session state, e.g. the time it takes to read from and write to the session state dictionary, will be much faster when the memory being read and written is in-process, as cross-process calls add overhead when data is marshaled back and forth or possibly read from SQL Server.
In-process mode is the default setting for ASP.NET. When this setting is used, the only other session config.web settings used are cookieless and timeout.
If we call SessionState.aspx, set a session state value, and stop and start the ASP.NET process (iisreset), the value set before the process was cycled will be lost.
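A minimal Web.config sketch of in-process mode with the two other applicable settings (the attribute values shown are illustrative defaults, not requirements):

```xml
<configuration>
  <system.web>
    <!-- In-process session state: fastest option, but state is lost when
         the worker process recycles. cookieless and timeout are the only
         other sessionState settings that apply in this mode. -->
    <sessionState mode="InProc" cookieless="false" timeout="20" />
  </system.web>
</configuration>
```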
StateServer (Out-of-process Mode):
Included with the .NET SDK is a Windows NT service: ASPState. This Windows service is what ASP.NET uses for out-of-process session state management.
ASP.NET session state supports several different storage options for session data. Each option is identified by a value in the SessionStateMode enumeration. The following list describes the available session state modes:
*In-process mode is the default session state mode and is specified using the InProc SessionStateMode enumeration value. In-process mode stores session state values and variables in memory on the local Web server. It is the only mode that supports the Session_OnEnd event. For more information about the Session_OnEnd event, see Session-State Events.
*StateServer mode stores session state in a process, referred to as the ASP.NET state service, that is separate from the ASP.NET worker process or IIS application pool. Using this mode ensures that session state is preserved if the Web application is restarted and also makes session state available to multiple Web servers in a Web farm.
To use StateServer mode, you must first be sure the ASP.NET state service is running on the server used for the session store. The ASP.NET state service is installed as a service when ASP.NET and the .NET Framework are installed.
To configure an ASP.NET application to use StateServer mode, in the application's Web.config file do the following:
*Set the mode attribute of the sessionState element to StateServer.
*Set the stateConnectionString attribute to tcpip=serverName:42424.
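The two steps above can be sketched in Web.config like this (serverName is a placeholder for your state server's host name; 42424 is the state service's default port):

```xml
<configuration>
  <system.web>
    <!-- Out-of-process session state held by the ASP.NET state service. -->
    <sessionState mode="StateServer"
                  stateConnectionString="tcpip=serverName:42424" />
  </system.web>
</configuration>
```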
*SQLServer mode stores session state in a SQL Server database. Using this mode ensures that session state is preserved if the Web application is restarted and also makes session state available to multiple Web servers in a Web farm.
To use SQLServer mode, you must first be sure the ASP.NET session state database is installed on SQL Server. You can install the ASP.NET session state database using the Aspnet_regsql.exe tool, as described later in this topic.
To configure an ASP.NET application to use SQLServer mode, do the following in the application's Web.config file:
*Set the mode attribute of the sessionState element to SQLServer.
*Set the sqlConnectionString attribute to a connection string for your SQL Server database.
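A sketch of the corresponding Web.config fragment (the connection string below is a placeholder; substitute your own server name and credentials):

```xml
<configuration>
  <system.web>
    <!-- Session state persisted in the ASP.NET session state database. -->
    <sessionState mode="SQLServer"
                  sqlConnectionString="Data Source=MySqlServer;Integrated Security=SSPI;" />
  </system.web>
</configuration>
```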
*Custom mode, which enables you to specify a custom storage provider.
*Off mode, which disables session state.
In the initial 1.0 release of ASP.NET, you had no choice about how to transmit the session token between requests when your Web application needed to maintain session state: it was always stored in a cookie. Unfortunately, this meant that users who would not accept cookies could not use your application. So, in ASP.NET 1.1, Microsoft added support for cookieless session tokens via use of the "cookieless" setting.
Web applications configured to use cookieless session state store the session token in the page URL rather than a cookie. For example, the page URL might change from http://myserver/MyApplication/default.aspx to http://myserver/MyApplication/(123456789ABCDEFG)/default.aspx. In this case, "123456789ABCDEFG" represents the current user's session token. A different user browsing the site at the same time would receive a completely different session token, resulting in a different URL, such as http://myserver/MyApplication/(ZYXWVU987654321)/default.aspx.
Graphical user interface programming is inherently more complex than ordinary applications programming because the graphical interface computation is driven by a stream of graphical input actions. All of the input actions performed by a program user, including moving the mouse, clicking a mouse button, and typing a keystroke, are processed by code in the computer operating system. This code determines when an input action of potential interest to the application occurs. Such an input action is called an "event". Typically mouse movement alone does not constitute an event; the operating system updates the position of the cursor on the screen as the mouse is moved. When a mouse button is clicked or a key is typed, the operating system interrupts the application program and informs it that the specified event has occurred.
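The queue-and-dispatch pattern described above can be sketched without any real GUI toolkit. This is a minimal model (the class and method names are illustrative, not any toolkit's API): events are posted to a queue, and a loop dispatches each one to the handler bound for its type.

```python
from collections import deque


class EventLoop:
    """Minimal sketch of the dispatch loop behind a GUI toolkit."""

    def __init__(self):
        self.queue = deque()   # events posted by the OS / input layer
        self.handlers = {}     # event type -> callback

    def bind(self, event_type, handler):
        self.handlers[event_type] = handler

    def post(self, event_type, data=None):
        self.queue.append((event_type, data))

    def run(self):
        """Drain the queue, dispatching each event to its bound handler."""
        handled = []
        while self.queue:
            event_type, data = self.queue.popleft()
            handler = self.handlers.get(event_type)
            if handler:        # events of no interest are simply dropped
                handler(data)
                handled.append(event_type)
        return handled
```

Note how raw mouse motion with no bound handler is discarded, matching the observation that mouse movement alone typically does not constitute an event for the application.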
Cache memory is random access memory (RAM) that a computer microprocessor can access more quickly than it can access regular RAM. As the microprocessor processes data, it looks first in the cache memory; if it finds the data there (from a previous reading of data), it does not have to do the more time-consuming read from larger memory. (Cache memory sits between the CPU and RAM to provide fast access to data.)
Architecture refers most directly to the built environment, the structures humans create and occupy. While buildings are one type of architecture artifact, other objects also document the built environment. These objects include photographs, drawings, or paintings of buildings. They also may be blueprints, building codes, furnishings, or written descriptions of physical spaces (such as architectural guidebooks or decorating manuals).
The classic pipeline stages are:
1. Instruction Fetch
2. Instruction Decode
3. Instruction Execution
4. Memory
5. Write Back
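The throughput gain of these stages can be shown with the standard cycle-count formulas. This is an idealized model that assumes no stalls or hazards:

```python
def pipelined_cycles(num_instructions, num_stages):
    """Cycles for n instructions in a k-stage pipeline with no stalls:
    the first instruction takes k cycles to drain through, and each
    subsequent instruction completes one cycle later."""
    return num_stages + (num_instructions - 1)


def unpipelined_cycles(num_instructions, num_stages):
    """Without pipelining, each instruction occupies all k stages alone."""
    return num_instructions * num_stages


# 100 instructions through the 5 stages above:
# pipelined_cycles(100, 5) == 104 cycles vs unpipelined_cycles(100, 5) == 500,
# approaching one completed instruction per cycle as n grows.
```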
Physically, a cache is a small, fast storage area (for a software cache, a region of RAM) that allocates and deallocates the most frequently used information based on replacement policies such as FIFO and LRU.
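One of the policies mentioned, LRU (least recently used), can be sketched in a few lines. This is a minimal illustration built on Python's `OrderedDict`, which remembers insertion order:

```python
from collections import OrderedDict


class LRUCache:
    """Least-recently-used cache sketch with a fixed capacity."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.data = OrderedDict()

    def get(self, key):
        if key not in self.data:
            return None
        self.data.move_to_end(key)         # mark as most recently used
        return self.data[key]

    def put(self, key, value):
        if key in self.data:
            self.data.move_to_end(key)
        self.data[key] = value
        if len(self.data) > self.capacity:
            self.data.popitem(last=False)  # evict the least recently used
```

Accessing an entry refreshes it, so when the cache overflows, the entry that has gone unused the longest is the one evicted.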