

Cache patents



      
           
This page is updated frequently with new Cache-related patent applications. Subscribe to the Cache RSS feed to receive updates automatically.



Date / App# / List of recent Cache-related patent applications
08/28/14
20140245446
Performing security operations using binary translation
In an embodiment, a processor includes a binary translation engine to receive a code segment, to generate a binary translation of the code segment, and to store the binary translation in a translation cache, where the binary translation includes at least one policy check routine to be executed during execution of the binary translation on behalf of a security agent. Other embodiments are described and claimed.
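As a rough illustration of the idea in the abstract above, the sketch below keeps translated code segments in a translation cache and runs a security agent's policy check each time a cached translation executes. All names here (TranslationCache, policy_check, the fake "translation") are hypothetical and not taken from the application.

```python
# Hypothetical sketch only: a translation cache whose entries embed a
# policy check supplied by a security agent. Nothing here is drawn from
# the actual patent application.

class TranslationCache:
    def __init__(self, policy_check):
        self._cache = {}                  # code segment -> callable "translation"
        self._policy_check = policy_check

    def _translate(self, segment):
        # Stand-in for binary translation: wrap the segment in a callable
        # that runs the injected policy check routine before "executing".
        def translated():
            self._policy_check(segment)
            return f"executed:{segment}"
        return translated

    def execute(self, segment):
        # Serve from the translation cache when possible; otherwise
        # translate, cache, and run.
        if segment not in self._cache:
            self._cache[segment] = self._translate(segment)
        return self._cache[segment]()


if __name__ == "__main__":
    def policy_check(segment):
        if "forbidden" in segment:
            raise PermissionError(f"policy violation in {segment!r}")

    tc = TranslationCache(policy_check)
    print(tc.execute("mov eax, 1"))   # translated, cached, checked, run
    print(tc.execute("mov eax, 1"))   # second call hits the translation cache
```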
08/28/14
20140244976
IT instruction pre-decode
Various techniques are described for processing and pre-decoding branches within an IT instruction block. Instructions are fetched and cached in an instruction cache, and pre-decode bits are generated to indicate the presence of an IT instruction and the likely boundaries of the IT instruction block.
08/28/14
20140244973
Reconfigurable elements
The present invention provides for a multiprocessor device on either a chip or a stack of chips. The multiprocessor device includes a plurality of processing entities and a memory system.
08/28/14
20140244939
Texture cache memory system of non-blocking for texture mapping pipeline and operation method of texture cache memory
A non-blocking texture cache memory for a texture mapping pipeline and an operation method of the non-blocking texture cache memory may include: a retry buffer configured to temporarily store result data according to a hit pipeline or a miss pipeline; a retry buffer lookup unit configured to look up the retry buffer in response to a texture request transferred from a processor; a verification unit configured to verify whether result data corresponding to the texture request is stored in the retry buffer as the lookup result; and an output control unit configured to output the stored result data to the processor when the result data corresponding to the texture request is stored as the verification result.
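A minimal sketch of the retry-buffer flow described above, assuming a dictionary-backed retry buffer and an in-memory stand-in for texture memory; the function and class names are illustrative, not from the application.

```python
# Hypothetical sketch: look up the retry buffer first, and only fall back
# to the (simplified) miss path when no stored result is found.

class RetryBuffer:
    def __init__(self):
        self._results = {}                 # texture request id -> texel data

    def lookup(self, request_id):
        return self._results.get(request_id)   # None if nothing stored yet

    def store(self, request_id, data):
        self._results[request_id] = data


def service_texture_request(request_id, retry_buffer, texture_memory):
    # Verification step: is the result already in the retry buffer?
    data = retry_buffer.lookup(request_id)
    if data is not None:
        return data                        # output the stored result
    # Simplified miss pipeline: fetch, then park the result for retries.
    data = texture_memory[request_id]
    retry_buffer.store(request_id, data)
    return data


if __name__ == "__main__":
    texture_memory = {("tex0", 3, 7): b"\x10\x20\x30\x40"}
    rb = RetryBuffer()
    print(service_texture_request(("tex0", 3, 7), rb, texture_memory))  # miss path
    print(service_texture_request(("tex0", 3, 7), rb, texture_memory))  # retry-buffer hit
```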
08/28/14
20140244938
Method and apparatus for returning reads in the presence of partial data unavailability
Techniques are disclosed for reducing perceived read latency. Upon receiving a read request with a scatter-gather array from a guest operating system running on a virtual machine (VM), an early read return virtualization (ERRV) component of a virtual machine monitor fills the scatter-gather array with data from a cache and data retrieved via input-output requests (IOs) to media.
08/28/14
20140244936
Maintaining cache coherency between storage controllers
Systems and methods maintain cache coherency between storage controllers utilizing bitmap data. In one embodiment, a storage controller processes an I/O request for a logical volume from a host and, based on the request, generates one or more cache entries in a cache memory.
08/28/14
20140244935
Storage system capable of managing a plurality of snapshot families and method of snapshot family based read
A method for a snapshot family based reading of data units from a storage system comprises: receiving a read request for reading a requested data entity; searching in a cache memory of the storage system for a matching cached data entity; if the matching cached data entity is not found, searching for one or more relevant data entity candidates stored in the storage system; selecting, out of the one or more relevant data entity candidates, the relevant data entity whose content has the highest probability of being equal to the content of the requested data entity; and responding to the read request by sending the selected relevant data entity.
08/28/14
20140244934
Storage apparatus
When data designated by a read request from a mainframe is stored in a cache memory, a transfer control unit refers to internal control information, identifies the length of a key area and the length of a data area of the data designated by the read request, calculates an address of that data in the cache memory based on the identified length of the key area, the identified length of the data area, and the length of a count area, which is a fixed length, and controls processing for collectively transferring the data stored at the calculated address from the cache memory to a channel adapter.
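The address arithmetic hinted at above resembles a classic count-key-data layout, so a tentative sketch is shown below; the fixed count-area length, record list, and function name are assumptions, not details from the application.

```python
# Hypothetical sketch: locate the data area of one record inside a cached
# track image laid out as [count | key | data] per record, with a
# fixed-length count area. The sizes below are assumed for illustration.

COUNT_AREA_LEN = 8   # fixed-length count area (assumed)

def data_address(records, target_index, base_address=0):
    """Return the cache address of the data area of records[target_index].

    records is a list of (key_len, data_len) tuples in track order.
    """
    addr = base_address
    for i, (key_len, data_len) in enumerate(records):
        if i == target_index:
            # Skip this record's count and key areas to reach its data area.
            return addr + COUNT_AREA_LEN + key_len
        # Skip the whole preceding record: count + key + data.
        addr += COUNT_AREA_LEN + key_len + data_len
    raise IndexError("target record not in the cached track image")


if __name__ == "__main__":
    track = [(4, 512), (4, 1024), (8, 256)]   # (key_len, data_len) per record
    print(hex(data_address(track, 2, base_address=0x1000)))
```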
08/28/14
20140244933
Way lookahead
Methods and systems that identify and power up ways for future instructions are provided. A processor includes an n-way set associative cache and an instruction fetch unit.
08/28/14
20140244932
Method and apparatus for caching and indexing victim pre-decode information
The present invention provides a method and apparatus for caching pre-decode information. Some embodiments of the apparatus include a first pre-decode array configured to store pre-decode information for an instruction cache line that is resident in a first cache in response to the instruction cache line being evicted from one or more second cache(s).
08/28/14
20140244920
Scheme to escalate requests with address conflicts
Techniques for escalating a real time agent's request that has an address conflict with a best effort agent's request. A best effort request can be allocated in a memory controller cache but can progress slowly in the memory system due to its low priority.
08/28/14
20140244918
Methods and systems for reducing churn in flash-based cache
A storage device includes a flash memory-based cache for a hard disk-based storage device and a controller that is configured to limit the rate of cache updates through a variety of mechanisms, including determinations that the data is not likely to be read back from the storage device within a time period that justifies its storage in the cache, compressing data prior to its storage in the cache, precluding storage of sequentially-accessed data in the cache, and/or throttling storage of data to the cache within predetermined write periods and/or according to user instruction.
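The snippet below is a loose sketch of the kind of admission filter the abstract describes: skip sequential accesses, compress what is admitted, and throttle cache writes per time window. The thresholds, window sizes, and names are assumptions, not the claimed method.

```python
# Hypothetical sketch of a flash-cache admission filter: refuse sequential
# data, compress admitted data, and cap cache writes per time window.
import time
import zlib

class FlashCacheAdmission:
    def __init__(self, max_writes_per_window=100, window_seconds=60):
        self.max_writes = max_writes_per_window
        self.window = window_seconds
        self._window_start = time.monotonic()
        self._writes_in_window = 0
        self._last_lba = None
        self.cache = {}                          # lba -> compressed data

    def _is_sequential(self, lba):
        sequential = self._last_lba is not None and lba == self._last_lba + 1
        self._last_lba = lba
        return sequential

    def _throttled(self):
        now = time.monotonic()
        if now - self._window_start > self.window:
            self._window_start, self._writes_in_window = now, 0
        return self._writes_in_window >= self.max_writes

    def maybe_cache(self, lba, data):
        # Preclude sequentially-accessed data and respect the write budget.
        if self._is_sequential(lba) or self._throttled():
            return False
        self.cache[lba] = zlib.compress(data)    # compress before caching
        self._writes_in_window += 1
        return True
```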
08/28/14
20140244917
Methods and systems for reducing churn in flash-based cache
A storage device includes a flash memory-based cache for a hard disk-based storage device and a controller that is configured to limit the rate of cache updates through a variety of mechanisms, including determinations that the data is not likely to be read back from the storage device within a time period that justifies its storage in the cache, compressing data prior to its storage in the cache, precluding storage of sequentially-accessed data in the cache, and/or throttling storage of data to the cache within predetermined write periods and/or according to user instruction.
08/28/14
20140244902
Fast read in write-back cached memory
An apparatus having a cache and a circuit is disclosed. The cache includes a plurality of cache lines.
08/28/14
20140244898
I/O hint framework for server flash cache
An I/O hint framework is provided. In one embodiment, a computer system can receive an I/O command originating from a virtual machine (VM), where the I/O command identifies a data block of a virtual disk.
08/28/14
20140244800
Method for collecting online analytics data using server clusters
A visitor to a website is allocated to a server cluster which delivers content to the visitor. Online analytics data is collected by means of the servers of the server cluster, and the collected information is stored in a cluster cache database of the server cluster.
08/28/14
20140244799
Installation of an asset from a cloud marketplace to a cloud server in a private network
A mechanism is provided in a data processing system for deploying an asset from a marketplace to a computing device behind an enterprise firewall. A grabber service in a cloud management computing device in a private network behind the enterprise firewall monitors a placeholder in a file system of a marketplace server.
08/28/14
20140244727
Method and apparatus for streaming multimedia content of server by using cache
A technology for a client to stream content from a server is provided. The technology includes a service installed in the client that receives the content from the server instead of the media player, downloads the content into a cache having a larger capacity than a memory buffer of the media player, and then transfers data stored in the cache to the media player.
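A toy sketch of the flow described above: the client-side service pulls the stream into a large cache and then feeds the player's smaller buffer from it. The chunk sizes and the in-memory stand-ins for the server, cache, and player are assumptions.

```python
# Hypothetical sketch: download into a large cache first, then hand the
# media player data in pieces sized to its (smaller) memory buffer.
import io

DOWNLOAD_CHUNK = 64 * 1024     # service download chunk size (assumed)
PLAYER_BUFFER = 8 * 1024       # media player buffer size (assumed, smaller)

def download_to_cache(server_stream, cache_file):
    # The installed service receives the content instead of the player.
    for chunk in iter(lambda: server_stream.read(DOWNLOAD_CHUNK), b""):
        cache_file.write(chunk)
    cache_file.seek(0)

def feed_player(cache_file, player_consume):
    # Data stored in the cache is transferred to the player buffer-by-buffer.
    for piece in iter(lambda: cache_file.read(PLAYER_BUFFER), b""):
        player_consume(piece)

if __name__ == "__main__":
    server = io.BytesIO(b"x" * 200_000)          # stand-in for the content server
    cache = io.BytesIO()                         # stand-in for the large cache
    download_to_cache(server, cache)
    feed_player(cache, lambda piece: None)       # "player" discards data here
```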
08/28/14
20140244725
DNS outage avoidance method for recursive DNS servers
This disclosure describes systems, methods, and apparatus to protect users of the Internet from DNS outages. In particular, an outage avoidance system is provided that includes query processing, outage avoidance processing, and a local cache, all configured to extend the TTL of expired answers to DNS queries or to ignore the expiration of an answer's TTL value, and thereby provide the expired answer in response to a client request when a DNS server is unable to obtain an answer from a remote DNS server.
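A small sketch of the serve-stale behaviour, assuming a resolver callback that raises OSError when the remote DNS server is unreachable; the class and method names are illustrative only.

```python
# Hypothetical sketch: serve an expired cached answer when the upstream
# resolver cannot be reached, effectively extending the answer's TTL.
import time

class ServeStaleCache:
    def __init__(self, resolve_upstream):
        self._resolve = resolve_upstream   # callable(name) -> (answer, ttl_seconds)
        self._cache = {}                   # name -> (answer, expiry timestamp)

    def lookup(self, name):
        entry = self._cache.get(name)
        if entry and entry[1] > time.time():
            return entry[0]                # fresh answer straight from cache
        try:
            answer, ttl = self._resolve(name)
        except OSError:
            if entry:
                return entry[0]            # outage: ignore the expired TTL
            raise
        self._cache[name] = (answer, time.time() + ttl)
        return answer
```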
08/28/14
20140244688
Access requests with cache intentions
A lease system is described herein that allows clients to request a lease to a remote file, wherein the lease permits access to the file across multiple applications using multiple handles without extra round trips to a server. When multiple applications on the same client (or multiple components of the same application) request access to the same file, the client specifies the same lease identifier to the server for each open request or may handle the request from the cache based on the existing lease.
08/28/14
20140244675
Instantaneous incremental search user interface
An incremental search user interface is implemented to reduce search requests from a client system to a server system. In one aspect, a result list is cached in a memory of the client system, where the result list corresponds to a search request from the client system to the server system.
08/28/14
20140244619
Intelligent data caching for typeahead search
Techniques for providing low latency incremental search results are disclosed herein. According to one embodiment, a method for incremental search includes receiving a first search query from a user, obtaining a plurality of first search results in response to the first search query from an index server, determining whether the plurality of first search results are a substantially exhausted list of results for the first search query, and caching the plurality of first search results in a cache storage if the plurality of first search results are the substantially exhausted list of results for the first search query.
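The sketch below illustrates one way the exhausted-list idea could look in code: results are cached only when the index server's list appears exhausted, so later, longer prefixes can be answered by filtering locally. The exhaustion threshold and names are assumptions.

```python
# Hypothetical sketch: cache only substantially exhausted result lists so
# that longer typeahead prefixes can be answered from the cache.

class TypeaheadCache:
    def __init__(self, index_server, exhaust_limit=50):
        self._search = index_server        # callable(query) -> list of result strings
        self._limit = exhaust_limit
        self._cache = {}                   # query -> exhausted result list

    def query(self, text):
        # A cached, exhausted list for a shorter prefix can answer locally.
        for prefix, results in self._cache.items():
            if text.startswith(prefix):
                return [r for r in results if text in r]
        results = self._search(text)
        if len(results) < self._limit:     # short list: treat as exhausted
            self._cache[text] = results
        return results
```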
08/28/14
20140244584
System and method for implementing cache consistent regional clusters
When multiple regional data clusters are used to store data in a system, maintaining cache consistency across different regions is important for providing a desirable user experience. In one embodiment, there is a master data cluster where all data writes are performed, and the writes are replicated to each of the slave data clusters in the other regions.
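A bare-bones sketch of the write path implied above: every write goes to the master cluster, is replicated to each regional slave, and the corresponding cache entry in each region is invalidated. The dict-based clusters and caches are stand-ins, not the described system.

```python
# Hypothetical sketch: master-only writes, replication to regional slaves,
# and per-region cache invalidation to keep caches consistent.

class RegionalClusters:
    def __init__(self, regions):
        self.master = {}                              # authoritative store
        self.slaves = {r: {} for r in regions}        # regional replicas
        self.caches = {r: {} for r in regions}        # per-region caches

    def write(self, key, value):
        self.master[key] = value
        for region in self.slaves:
            self.slaves[region][key] = value          # replicate the write
            self.caches[region].pop(key, None)        # drop the stale cache entry

    def read(self, region, key):
        cache = self.caches[region]
        if key not in cache:
            cache[key] = self.slaves[region][key]     # refill from the replica
        return cache[key]
```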
08/28/14
20140244581
System and method for log conflict detection and resolution in a data store
A system that implements a data storage service may store data on behalf of storage service clients. The system may maintain data in multiple replicas that are stored on respective computing nodes in the system.
08/28/14
20140244517
Incremental batch method for transforming event-driven metrics and measures within a map/reduce data center
A method for a plurality of processors configured to perform steps in a map/reduce network operation adds incremental batch transformation of sequential measures recorded by time periods and uploaded asynchronously from their capture on mobile devices. The method creates and tracks measure states for each measure.
08/28/14
20140244167
Methods, apparatuses and computer program products for providing a location correction cache
An apparatus for correcting one or more detected locations may include a processor and memory storing executable computer code causing the apparatus to at least perform operations including determining whether a detected location of a communication device was previously designated as an incorrect location. The computer program code may further cause the apparatus to enable provision of an option to a user of the communication device to select a specified correct location corresponding to the incorrect location in response to determining that the detected location was previously designated as the incorrect location.
08/28/14
20140244013
Pre-caching of audio content
Embodiments are provided for causing a playback device to pre-cache audio content in anticipation that a user will provide input to cause the playback device to render the audio content. The playback device may be configured to detect, using a proximity sensor on the playback device, movement in relation to the playback device, responsively retrieve audio content from a networked audio source prior to receiving a user command to play the audio content, store the audio content in memory on the playback device, and render the audio content upon receiving the user command.
08/28/14
20140241066
Dual-function read/write cache for programmable non-volatile memory
A non-volatile memory, such as a one-time programmable memory, with a dual purpose read/write cache. The read/write cache is used as a write cache during programming, and stores the data to be written for a full row of the memory array.
08/28/14
20140240517
Monitoring video waveforms
A video signal waveform monitor is shown, which receives an input video signal composed of video lines. A video signal digitizer samples the input video signal at video sample points to generate a sequence of video pixel data, which is written into an acquisition framestore that is organized into a video pixel array so as to represent a raster of the input video signal.
08/28/14
20140240335
Cache allocation in a computerized system
System and method for operating a solid state memory containing a memory space. The present invention provides a computerized system that includes a solid state memory having a memory space; a controller adapted to use a first portion of the memory space as a cache; and a garbage collector adapted to use a second portion of the memory space to collect garbage in the solid state memory.
08/21/14
20140237601
Operation of a dual instruction pipe virus co-processor
Circuits and methods are provided for detecting, identifying and/or removing undesired content. According to one embodiment, a content object is stored by a general purpose processor to a system memory.
08/21/14
20140237534
Control plane architecture for multicast cache-fill
A multicast content delivery system can use both multicast and unicast streams to efficiently use available bandwidth to deliver content. Available multicast content can be identified to gateways serving consumption devices, and the gateways can receive requests for unicast content delivery but honor the requests with multicast group sessions.
08/21/14
20140237321
Solid state drive cache recovery in a clustered storage system
A storage system that includes multiple nodes, each of which comprises an SSD cache and a management module, and hard disk drives that are coupled to the nodes. The management module of each node is arranged to manage an SSD cache map that comprises multiple entries for storing mappings from logical addresses to SSD cache physical addresses and to physical addresses in the hard disk drives.
08/21/14
20140237224
Network boot system
Network boot system 100 includes server 10 and terminal 20, which has recording device 22a, connected to each other through network 30. The terminal includes a read cache mechanism that stores a cache in a read cache region by a read cache driver.
08/21/14
20140237194
Efficient validation of coherency between processor cores and accelerators in computer systems
A method of testing cache coherency in a computer system design allocates different portions of a single cache line for use by accelerators and processors. The different portions of the cache line can have different sizes, and the processors and accelerators can operate in the simulation at different frequencies.
08/21/14
20140237193
Cache window management
A method of managing a plurality of least recently used (LRU) queues having entries that correspond to cached data includes ordering a first plurality of entries in a first queue according to a first recency of use of cached data. The first queue corresponds to a first priority.
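A compact sketch of per-priority LRU ordering, using OrderedDict as a stand-in for the queue bookkeeping; the two priority levels and method names are assumptions.

```python
# Hypothetical sketch: one LRU-ordered queue per priority; entries move to
# the front of their queue on use, and eviction takes the back of a queue.
from collections import OrderedDict

class PriorityLRU:
    def __init__(self, priorities=(0, 1)):
        self._queues = {p: OrderedDict() for p in priorities}

    def touch(self, priority, key, value=None):
        q = self._queues[priority]
        if key in q:
            q.move_to_end(key, last=False)            # most recently used first
        else:
            q[key] = value
            q.move_to_end(key, last=False)

    def evict(self, priority):
        q = self._queues[priority]
        return q.popitem(last=True) if q else None    # least recently used last
```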
08/21/14
20140237191
Methods and apparatus for intra-set wear-leveling for memories with limited write endurance
Efficient techniques are described for extending the usable lifetime of memories with limited write endurance. A technique for wear-leveling of caches addresses unbalanced write traffic on cache lines, which causes heavily written cache lines to fail much faster than other lines in the cache.
08/21/14
20140237189
Compression status bit cache and backing store
One embodiment of the present invention sets forth a technique for increasing available storage space within compressed blocks of memory attached to data processing chips, without requiring a proportional increase in on-chip compression status bits. A compression status bit cache provides on-chip availability of compression status bits used to determine how many bits are needed to access a potentially compressed block of memory.
08/21/14
20140237187
Adaptive multilevel binning to improve hierarchical caching
A device driver calculates a tile size for a plurality of cache memories in a cache hierarchy. The device driver calculates a storage capacity of a first cache memory.
08/21/14
20140237186
Filtering snoop traffic in a multiprocessor computing system
Filtering snoop traffic in a multiprocessor computing system, where each processor is coupled to a high level cache and a low level cache, including: receiving a snoop message that identifies an address in shared memory targeted by a write operation; identifying a set in the high level cache that maps to the address in shared memory; determining whether the high level cache includes an entry associated with the address in shared memory; and, responsive to determining that the high level cache does not include an entry corresponding to the address in shared memory, determining whether the set in the high level cache has been bypassed by an entry in the low level cache and, responsive to determining that the set has not been bypassed by an entry in the low level cache, discarding the snoop message.
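The decision logic reads naturally as a small filter, sketched below under an assumed cache model (a dict of cached addresses per set plus a record of bypassed sets); nothing here reflects the real hardware structures.

```python
# Hypothetical sketch of the filtering decision: drop a snoop when the
# high level cache has no entry for the address and the address's set has
# not been bypassed by a low level cache entry.

NUM_SETS = 64
LINE_BYTES = 64          # assumed line size

def set_index(address):
    return (address // LINE_BYTES) % NUM_SETS

def handle_snoop(address, high_cache, bypassed_sets):
    """high_cache: dict mapping set index -> set of cached addresses.
    bypassed_sets: set indices bypassed by entries in the low level cache."""
    s = set_index(address)
    if address in high_cache.get(s, set()):
        return "forward"            # the targeted line may be cached here
    if s in bypassed_sets:
        return "forward"            # cannot prove absence; keep the snoop
    return "discard"                # set not bypassed: safe to drop the snoop
```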
08/21/14
20140237185
One-cacheable multi-core architecture
Technologies are generally described for methods, systems, and devices effective to implement one-cacheable multi-core architectures. In one example, a multi-core processor that includes a first and second tile may be configured to implement a one-cacheable architecture.
08/21/14
20140237184
System and method for multi-tiered meta-data caching and distribution in a clustered computer environment
A system and method caches and distributes meta-data for one or more data containers stored on a plurality of volumes configured as a striped volume set (SVS) and served by a plurality of nodes interconnected as a cluster. The SVS comprises one meta-data volume (MDV) configured to store a canonical copy of certain meta-data, including access control lists and directories, associated with all data containers stored on the SVS, and one or more data volumes (DV) configured to store, at least, data content of those containers.
08/21/14
20140237183
Systems and methods for intelligent content aware caching
Methods and systems to intelligently cache content in a virtualization environment using virtualization software such as VMware ESX, Citrix XenServer, Microsoft Hyper-V, Red Hat KVM, or their variants are disclosed. Storage IO operations (reads from and writes to disk) are analyzed (or characterized) for their overall value and pinned to cache if their value exceeds a certain defined threshold based on criteria specific to the New Technology File System (NTFS) file system.
08/21/14
20140237182
Method and apparatus for servicing read and write requests using a cache replacement catalog
Methods and systems to intelligently cache content in a virtualization environment using virtualization software such as VMware ESX, Citrix XenServer, Microsoft Hyper-V, Red Hat KVM, or their variants are disclosed. Storage IO operations (reads from and writes to disk) are analyzed (or characterized) for their overall value and pinned to cache if their value exceeds a certain defined threshold based on criteria specific to the New Technology File System (NTFS) file system.
08/21/14
20140237181
Method and apparatus for preparing a cache replacement catalog
Methods and systems to intelligently cache content in a virtualization environment using virtualization software such as VMware ESX, Citrix XenServer, Microsoft Hyper-V, Red Hat KVM, or their variants are disclosed. Storage IO operations (reads from and writes to disk) are analyzed (or characterized) for their overall value and pinned to cache if their value exceeds a certain defined threshold based on criteria specific to the New Technology File System (NTFS) file system.
08/21/14
20140237174
Highly efficient design of storage array utilizing multiple cache lines for use in first and second cache spaces and memory subsystems
A method of operating a cache memory includes the step of storing a set of data in a first space in a cache memory, the set of data being associated with a set of tags. A subset of the set of data is stored in a second space in the cache memory, the subset of the set of data being associated with a tag of a subset of the set of tags.
08/21/14
20140237171
Solid-state disk with wireless functionality
A system including an interface module to interface a solid-state disk controller to a computing device. A memory control module exchanges data with the computing device via the interface module and caches the data in a solid-state memory controlled by the solid-state disk controller.
08/21/14
20140237163
Reducing writes to solid state drive cache memories of storage controllers
Methods and structure are provided for reducing the number of writes to a cache of a storage controller. One exemplary embodiment includes a storage controller that has a non-volatile flash cache memory, a primary memory that is distinct from the cache memory, and a memory manager.
08/21/14
20140237160
Inter-set wear-leveling for caches with limited write endurance
A cache controller includes a first register that updates after every memory location swap operation on a number of cache sets in a cache memory and resets every N−1 memory location swap operations, where N is the number of cache sets in the cache memory.
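The register's counting behaviour is easy to picture in a few lines; the sketch below only models the update-and-wrap rule (advance on every swap, reset every N−1 swaps) and invents its own names.

```python
# Hypothetical sketch: a counter that advances after every set-swap
# operation and wraps every N-1 swaps, where N is the number of cache sets.

class SwapRegister:
    def __init__(self, num_sets):
        self.n = num_sets
        self.value = 0

    def record_swap(self):
        # Update after every swap; reset every N-1 swap operations.
        self.value = (self.value + 1) % (self.n - 1)
        return self.value


if __name__ == "__main__":
    reg = SwapRegister(num_sets=8)
    print([reg.record_swap() for _ in range(10)])   # counts 1..6, then wraps to 0
```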
08/21/14
20140237157
System and method for providing an address cache for memory map learning
A system for interfacing with a co-processor or input/output device is disclosed. According to one embodiment, the system provides a one-hot address cache comprising a plurality of one-hot addresses and a host interface to a host memory controller of a host system.
08/21/14
20140237147
Systems, methods, and interfaces for adaptive persistence
A storage module may be configured to service I/O requests according to different persistence levels. The persistence level of an I/O request may relate to the storage resource(s) used to service the I/O request, the configuration of the storage resource(s), the storage mode of the resources, and so on.
08/21/14
20140237118
Application aware elephant flow management
A network device manages elephant flows. The network device filters received network data according to application-specific criteria and identifies the elephant flow from the filtered network data.
08/21/14
20140237081
Method and multimedia content manager for managing multimedia content download
The invention relates to a method, multimedia content manager (MUMCM), gateway and user device for managing multimedia content download. The MUMCM is in communication with a user device and with an operator network through a subscription.
08/21/14
20140237071
Caching in mobile networks
There is described a method for optimising the distribution of data objects between caches in a cache domain of a resource limited network. User requests for data objects are received at caches in the cache domain.
08/21/14
20140237068
Method, system and server of removing a distributed caching object
The present disclosure discloses a method, a system and a server of removing a distributed caching object. In one embodiment, the method receives a removal request, where the removal request includes an identifier of an object.
08/21/14
20140237067
Connection cache method and system
A method, apparatus and computer program product for maintaining a connection cache at an intermediate server, wherein the connection cache relates to resource requests from a plurality of devices to a plurality of servers remote therefrom. The method comprises monitoring resource requests addressed to a plurality of said remote servers during a first time period; generating statistics data on the basis of the monitored resource requests; establishing a plurality of connections from the intermediate server to a subset of the plurality of remote servers, said subset being determined on the basis of the generated statistics data; and storing data indicative of the plurality of established connections in a connection cache.
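As a rough sketch of the monitor-then-connect flow above, the snippet counts requests per remote server during a period and pre-establishes connections to the busiest subset; the connect callback and top-k cut-off are assumptions.

```python
# Hypothetical sketch: gather per-server request statistics, then open and
# cache connections to the servers that the statistics favour.
from collections import Counter

class ConnectionCache:
    def __init__(self, connect, top_k=3):
        self._connect = connect            # callable(server) -> connection object
        self._top_k = top_k
        self._stats = Counter()            # monitored requests per remote server
        self._connections = {}             # server -> established connection

    def record_request(self, server):
        self._stats[server] += 1           # monitoring during the first period

    def establish_from_stats(self):
        # Connect to the subset of servers determined from the statistics.
        for server, _count in self._stats.most_common(self._top_k):
            if server not in self._connections:
                self._connections[server] = self._connect(server)

    def get(self, server):
        return self._connections.get(server)   # cached connection, or None
```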
08/21/14
20140237033
Method, device and mobile terminal for controlling interface display
A method of displaying user interfaces on an electronic device comprises: detecting a first user input to access an application by a first user, including identifying information of the first user; and, in response to detecting the first user input, determining whether there is confidential data associated with the application cached in memory.



This listing is a sample of recent patent applications related to Cache and is not a comprehensive history. There may be associated servicemarks and trademarks related to these patents. Please check with a patent attorney if you need further assistance or plan to use this information for business purposes. This patent data is also published by the USPTO and available for free on their website. Note that there may be alternative spellings for Cache with additional patents listed. Browse our RSS directory or search for other possible listings.