processing

Published By: HERE Technologies     Published Date: Mar 26, 2019
The retail sector has been expanding with each passing decade thanks to intelligent marketing, intense research around customer behavior, creative advertising, and the adoption of new technology. Now industry movers and shakers are looking to boost sales with location intelligence. With location intelligence, retail companies can track and predict consumer trends and shifts in demand, and with these insights they can capitalize on growth opportunities in new product or service areas. The HERE Open Location Platform provides powerful, easy-to-use developer tools, standardized technology, and scalable infrastructure to simplify the processing of location-based data. Learn how to gain a competitive edge with HERE Technologies.
Tags : 
location data, mapping, here technologies
    
HERE Technologies
Published By: Cisco EMEA     Published Date: Nov 13, 2017
Big data and analytics is a rapidly expanding field of information technology. Big data incorporates technologies and practices designed to support the collection, storage, and management of a wide variety of data types produced at ever-increasing rates. Analytics combines statistics, machine learning, and data preprocessing to extract valuable information and insights from big data.
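As a rough illustration of that combination (preprocessing, descriptive statistics, then a learned model), here is a minimal Python sketch using pandas and scikit-learn; the events.csv file and its column names are invented for the example and are not part of the abstract.

```python
# Illustrative only: a minimal preprocess -> summarize -> model pipeline
# over a hypothetical "events.csv" log (file and column names are assumptions).
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler

df = pd.read_csv("events.csv")                              # raw, high-volume event data
df = df.dropna(subset=["latency_ms", "bytes", "failed"])    # preprocessing step

print(df[["latency_ms", "bytes"]].describe())               # simple descriptive statistics

X = StandardScaler().fit_transform(df[["latency_ms", "bytes"]])
y = df["failed"]                                            # binary label: did the request fail?

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)
model = LogisticRegression().fit(X_train, y_train)          # machine-learning step
print("holdout accuracy:", model.score(X_test, y_test))
```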
Tags : 
big data, analytics, virtualization, cloudera, ibm, sas, sap, splunk
    
Cisco EMEA
Published By: SAP     Published Date: May 18, 2014
In-memory technology—in which entire datasets are pre-loaded into a computer’s random access memory, alleviating the need for shuttling data between memory and disk storage every time a query is initiated—has actually been around for a number of years. However, with the onset of big data, as well as an insatiable thirst for analytics, the industry is taking a second look at this promising approach to speeding up data processing.
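A minimal sketch of the concept, using Python's standard-library sqlite3 module rather than any vendor's in-memory database: the same aggregate query runs against a table stored on disk and against a copy pre-loaded entirely into RAM. The table, data, and row count are invented for illustration.

```python
# Illustrative sketch of the in-memory idea using sqlite3 (not SAP HANA):
# one aggregate query runs against an on-disk file and an all-in-RAM copy.
import os
import sqlite3
import tempfile
import time

disk_path = os.path.join(tempfile.mkdtemp(), "sales.db")
disk = sqlite3.connect(disk_path)                  # conventional: data lives on disk
disk.execute("CREATE TABLE sales (id INTEGER, region TEXT, amount REAL)")
disk.executemany(
    "INSERT INTO sales VALUES (?, ?, ?)",
    ((i, "EMEA" if i % 2 else "APAC", i * 0.1) for i in range(200_000)),
)
disk.commit()

mem = sqlite3.connect(":memory:")                  # in-memory: dataset pre-loaded into RAM
disk.backup(mem)                                   # one-time load from disk into memory

for name, conn in (("disk", disk), ("in-memory", mem)):
    start = time.perf_counter()
    conn.execute("SELECT region, SUM(amount) FROM sales GROUP BY region").fetchall()
    print(name, round(time.perf_counter() - start, 4), "seconds")
```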
Tags : 
sap, big data, real time data, in memory technology, data warehousing, analytics, big data analytics, data management, business insights, architecture, business intelligence, big data tools
    
SAP
Published By: Coupa     Published Date: Dec 14, 2018
Data from The Hackett Group’s most recent Purchase-to-Pay Performance Study shows that organizations with high levels of AP automation save 43% on invoice processing costs. However, top-performing organizations don’t focus solely on process automation.
Tags : 
    
Coupa
Published By: Hewlett Packard Enterprise     Published Date: Aug 02, 2017
In midsize and large organizations, critical business processing continues to depend on relational databases including Microsoft® SQL Server. While new tools like Hadoop help businesses analyze oceans of Big Data, conventional relational-database management systems (RDBMS) remain the backbone for online transaction processing (OLTP), online analytic processing (OLAP), and mixed OLTP/OLAP workloads.
Tags : 
database usage, database management, server usage, data protection
    
Hewlett Packard Enterprise
Published By: Hewlett Packard Enterprise     Published Date: Aug 02, 2017
What if you could reduce the cost of running Oracle databases and improve database performance at the same time? What would it mean to your enterprise and your IT operations? Oracle databases play a critical role in many enterprises. They’re the engines that drive critical online transaction processing (OLTP) and online analytical processing (OLAP) applications, the lifeblood of the business. These databases also create a unique challenge for IT leaders charged with improving productivity and driving new revenue opportunities while simultaneously reducing costs.
Tags : 
cost reduction, oracle database, it operation, online transaction, online analytics
    
Hewlett Packard Enterprise
Published By: Cisco EMEA     Published Date: Mar 26, 2019
Imagine if you could see deep into the future. And way back into the past, both at the same time. Imagine having visibility of everything that had ever happened and everything that was ever going to happen, everywhere, all at once. And then imagine processing power strong enough to make sense of all this data in every language and in every dimension. Unless you’ve achieved that digital data nirvana (and you haven’t told the rest of us), you’re going to have some unknowns in your world. In the world of security, unknown threats exist outside the enterprise in the form of malicious actors, state-sponsored attacks and malware that moves fast and destroys everything it touches. The unknown exists inside the enterprise in the form of insider threat from rogue employees or careless contractors – which was deemed by 24% of our survey respondents to pose the most serious risk to their organizations. The unknown exists in the form of new devices, new cloud applications, and new data. The unk
Tags : 
    
Cisco EMEA
Published By: Cisco     Published Date: Jun 21, 2016
The Cisco® Hyperlocation Solution is the industry’s first Wi-Fi network-based location system that can help businesses and users pinpoint a user’s location to within one to three meters, depending on the deployment. Combining innovative RF antenna and module design, faster and more frequent data processing, and a powerful platform for customer engagement, it can help businesses create more personalized and profitable customer experiences.
Tags : 
    
Cisco
Published By: Pentaho     Published Date: Feb 26, 2015
This eBook from O’Reilly Media will help you navigate the diverse and fast-changing landscape of technologies for processing and storing data (NoSQL, big data, MapReduce, etc.).
Tags : 
data systems, data-intensive applications, scalability, maintainability, data storage, application development
    
Pentaho
Published By: OpenText     Published Date: Mar 02, 2017
Watch the video to learn how Procure-to-Pay (P2P) solutions automate B2B processes to help you gain better visibility into transaction lifecycles, improve efficiency, and increase the speed and accuracy of order, shipping, and invoice processing.
Tags : 
supply chain, b2b, procure-to-pay, shipping, invoicing
    
OpenText
Published By: AWS     Published Date: Sep 05, 2018
Big data alone does not guarantee better business decisions. Often that data needs to be moved and transformed so Insight Platforms can discern useful business intelligence. To deliver those results faster than traditional Extract, Transform, and Load (ETL) technologies, use Matillion ETL for Amazon Redshift. This cloud-native ETL/ELT offering, built specifically for Amazon Redshift, simplifies the process of loading and transforming data and can help reduce your development time. This white paper will focus on approaches that can help you maximize your investment in Amazon Redshift. Learn how the scalable, cloud-native architecture and fast, secure integrations can benefit your organization, and discover ways this cost-effective solution is designed with cloud computing in mind. In addition, we will explore how Matillion ETL and Amazon Redshift make it possible for you to automate data transformation directly in the data warehouse to deliver analytics and business intelligence (BI).
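The ELT idea described here, loading raw data first and letting the warehouse's own compute do the transformation, can be sketched roughly as below. This is a generic illustration using psycopg2 rather than Matillion itself, and the cluster endpoint, credentials, schema, and table names are placeholders.

```python
# ELT sketch: load raw rows as-is, then let the warehouse perform the transform.
# Connection parameters, schema, and table names below are placeholders.
import psycopg2

conn = psycopg2.connect(
    host="example-cluster.abc123.us-east-1.redshift.amazonaws.com",
    port=5439, dbname="analytics", user="etl_user", password="...",
)

transform_sql = """
    INSERT INTO reporting.daily_revenue (order_date, revenue)
    SELECT order_date, SUM(amount)
    FROM staging.raw_orders           -- raw rows loaded earlier, e.g. via COPY
    WHERE amount IS NOT NULL          -- cleansing happens inside the warehouse
    GROUP BY order_date;
"""

with conn, conn.cursor() as cur:      # the transformation runs on the cluster's compute,
    cur.execute(transform_sql)        # not in a separate ETL server
conn.close()
```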
Tags : 
    
AWS
Published By: AWS     Published Date: Sep 05, 2018
Amazon Redshift Spectrum—a single service that can be used in conjunction with other Amazon services and products, as well as external tools—is revolutionizing the way data is stored and queried, allowing for more complex analyses and better decision making. Spectrum allows users to query very large datasets on S3 without having to load them into Amazon Redshift. This helps address the Scalability Dilemma—with Spectrum, data storage can keep growing on S3 and still be processed. By utilizing its own compute power and memory, Spectrum handles the hard work that would normally be done by Amazon Redshift. With this service, users can now scale to accommodate larger amounts of data than the cluster would have been capable of processing with its own resources.
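As a rough sketch of what querying S3 data in place looks like, the snippet below declares a Spectrum external table over files in S3 and joins it against an ordinary Redshift table. The cluster endpoint, credentials, bucket, and table names are placeholders, the external schema is assumed to already exist, and the DDL should be checked against current Redshift documentation.

```python
# Spectrum sketch: the clickstream data stays in S3; only the query runs in Redshift.
# Endpoint, credentials, schema, table, and bucket names are placeholders, and the
# external schema "spectrum_demo" is assumed to have been created beforehand.
import psycopg2

conn = psycopg2.connect(
    host="example-cluster.abc123.us-east-1.redshift.amazonaws.com",
    port=5439, dbname="analytics", user="analyst", password="...",
)
conn.autocommit = True        # external-table DDL is typically run outside a transaction

cur = conn.cursor()
cur.execute("""
    CREATE EXTERNAL TABLE spectrum_demo.clickstream (
        user_id BIGINT,
        url     VARCHAR(2048),
        ts      TIMESTAMP
    )
    STORED AS PARQUET
    LOCATION 's3://example-bucket/clickstream/';
""")

cur.execute("""
    SELECT c.url, COUNT(*) AS hits
    FROM spectrum_demo.clickstream AS c      -- read directly from S3, never loaded
    JOIN public.registered_users AS u        -- ordinary table inside the cluster
      ON c.user_id = u.user_id
    GROUP BY c.url
    ORDER BY hits DESC
    LIMIT 10;
""")
print(cur.fetchall())
conn.close()
```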
Tags : 
    
AWS
Published By: Oracle CX     Published Date: Oct 20, 2017
With the growing size and importance of information stored in today’s databases, accessing and using the right information at the right time has become increasingly critical. Real-time access and analysis of operational data is key to making faster and better business decisions, providing enterprises with unique competitive advantages. Running analytics on operational data has been difficult because operational data is stored in row format, which is best for online transaction processing (OLTP) databases, while storing data in column format is much better for analytics processing. Therefore, companies normally have both an operational database with data in row format and a separate data warehouse with data in column format, which leads to reliance on “stale data” for business decisions. With Oracle’s Database In-Memory and Oracle servers based on the SPARC S7 and SPARC M7 processors, companies can now store data in memory in both row and column formats and run analytics on their operational data.
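The row-versus-column trade-off the abstract describes can be seen even in plain Python: the same small table held row-wise (natural for fetching one complete transaction) and column-wise (natural for scanning one attribute across all rows). This is a conceptual sketch, not Oracle's implementation.

```python
# Conceptual sketch of row format vs. column format (not Oracle's implementation).
# OLTP favors rows: one record is read or written as a unit.
# Analytics favors columns: one attribute is scanned across every record.

rows = [                                     # row format: list of complete records
    {"order_id": 1, "customer": "acme", "amount": 120.0},
    {"order_id": 2, "customer": "globex", "amount": 75.5},
    {"order_id": 3, "customer": "acme", "amount": 310.2},
]

columns = {                                  # column format: one array per attribute
    "order_id": [1, 2, 3],
    "customer": ["acme", "globex", "acme"],
    "amount":   [120.0, 75.5, 310.2],
}

# OLTP-style access: fetch everything about order 2 -- natural in row format.
order = next(r for r in rows if r["order_id"] == 2)

# Analytic access: total revenue -- touches only the contiguous "amount" column.
total = sum(columns["amount"])

print(order, total)
```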
Tags : 
    
Oracle CX
Published By: Infosys     Published Date: May 30, 2018
Customers today are far more concerned about the contents and origin of a product than ever before. In such a scenario, granting them easy access to product information, via digital initiatives such as SmartLabel™, goes a long way in strengthening customer trust in a brand. But it also means expending several man-hours of effort processing unstructured data, with the possibility of human error. Intelligent automation can help save effort and time, with virtually error-free results. A consumer products conglomerate wanted a smart solution to implement SmartLabel™ compliance. See how Infosys helped and the five key takeaways from the project.
Tags : 
automation, brand, information, digital, customer
    
Infosys
Published By: IBM APAC     Published Date: Apr 27, 2018
While relying on x86 servers and Oracle databases to support its stock trading systems, Wanlian Securities found that processing a rapidly increasing number of transactions quickly became a huge challenge. The company shifted to IBM FlashSystem, which helped cut the average response time of its Oracle databases from 10 milliseconds to less than 0.4 milliseconds and improved CPU usage by 15%. Download this case study now.
Tags : 
    
IBM APAC
Published By: Dome9     Published Date: Apr 25, 2018
As of May 2017, according to a report from The Depository Trust & Clearing Corporation (DTCC), which provides financial transaction and data processing services for the global financial industry, cloud computing has reached a tipping point1. Today, financial services companies can benefit from the capabilities and cost efficiencies of the cloud. In October of 2016, the Federal Deposit Insurance Corporation (FDIC), the Office of the Comptroller of the Currency (OCC) and the Federal Reserve Board (FRB) jointly announced enhanced cyber risk management standards for financial institutions in an Advanced Notice of Proposed Rulemaking (ANPR)2. These proposed standards for enhanced cybersecurity are aimed at protecting the entire financial system, not just the institution. To meet these new standards, financial institutions will require the right cloud-based network security platform for comprehensive security management, verifiable compliance and governance, and active protection of customer data
Tags : 
    
Dome9
Published By: KPMG     Published Date: Nov 21, 2018
Many BPO providers have already invested heavily in robotic process automation (RPA), providing an opportunity to renegotiate your BPO contract to reflect the cost savings they are making as a result. But there’s more. Contract renegotiation also creates an opportunity to leverage providers’ investment to accelerate automation in your own business. Read this to discover:
• the benefits of automating business processes and the impressive level of cost savings this can deliver
• the impact of automation on the BPO market and the implications this has for BPO contracts
• a five-step process for assessing and renegotiating your outsourcing contract
• three options for starting your automation journey.
Tags : 
    
KPMG
Published By: Oracle     Published Date: Aug 09, 2018
The purpose of IT backup and recovery systems is to avoid data loss and recover quickly, thereby minimizing downtime costs. Traditional storage-centric data protection architectures such as Purpose-Built Backup Appliances (PBBAs), and the conventional backup and restore processing supporting them, are prone to failure on recovery. This is because the processes, both automated and manual, are too numerous, too complex, and too difficult to test adequately. In turn, this leads to unacceptable levels of failure for today’s mission-critical applications and a poor foundation for digital transformation initiatives. Governments are taking notice. Heightened regulatory compliance requirements have implications for data recovery processes and are an unwelcome but timely catalyst for companies to get their recovery houses in order. Malware such as ransomware, along with other cyber attacks, increases the imperative for organizations to have highly granular recovery mechanisms in place that allow
Tags : 
    
Oracle
Published By: Automation Anywhere     Published Date: Feb 21, 2019
Automation Anywhere’s flagship product is Automation Anywhere Enterprise – an RPA platform offering a variety of tools to help organisations develop, operate and manage RPA bots that automate data entry, data gathering and other routine tasks usually carried out as part of high-volume, repetitive work (for example, service fulfilment work in call centres, shared-service centres and back-office processing environments). Automation Anywhere Enterprise bots can add value in both unattended (server-based, lights-out operation) and attended (desktop-based, interactive) deployment configurations. In this report, MWD Advisors digs deeper into the features and capabilities of Automation Anywhere’s product portfolio, analysing its fast-growth trajectory and highlighting large-scale implementations.
Tags : 
    
Automation Anywhere
Published By: IBM APAC     Published Date: Nov 22, 2017
Using IBM Watson’s cognitive capabilities, companies can quickly differentiate their customer service quality by being more proactive and responsive to customer needs. Simply put, chatbots and virtual agents are the future of customer interactions. Building apps from scratch that incorporate natural language processing, speech-to-text recognition, visual recognition, analytics, and artificial intelligence requires broad expertise in these disciplines, large staffs, and a huge financial commitment. Making use of IBM Watson cognitive services brings these capabilities in-house quickly and without the capital investment that would be needed to develop the technologies within an organization.
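For scale, a minimal sketch of calling one such service from Python is shown below, assuming the ibm-watson SDK and a provisioned Natural Language Understanding instance; the API key, service URL, and version string are placeholders, so verify the exact calls against the current SDK documentation.

```python
# Minimal sketch using the ibm-watson Python SDK (credentials, URL, and version
# string are placeholders; check the current SDK docs before relying on this).
from ibm_watson import NaturalLanguageUnderstandingV1
from ibm_watson.natural_language_understanding_v1 import (
    Features, SentimentOptions, EntitiesOptions,
)
from ibm_cloud_sdk_core.authenticators import IAMAuthenticator

nlu = NaturalLanguageUnderstandingV1(
    version="2021-08-01",
    authenticator=IAMAuthenticator("YOUR_API_KEY"),
)
nlu.set_service_url("https://api.us-south.natural-language-understanding.watson.cloud.ibm.com")

response = nlu.analyze(
    text="The replacement router arrived quickly, but the setup instructions were confusing.",
    features=Features(sentiment=SentimentOptions(), entities=EntitiesOptions(limit=5)),
).get_result()

print(response["sentiment"]["document"]["label"])   # e.g. "positive" / "negative"
```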
Tags : 
decision making, deeper data, insights, cognitive, analytics, ibm, watson, virtual agents
    
IBM APAC
Published By: IBM APAC     Published Date: Nov 22, 2017
AlchemyAPI’s approach to natural language processing incorporates both linguistic and statistical analysis techniques into a single unified system. This hybrid approach provides an industry-leading advantage, since both techniques have benefits and drawbacks depending on the content and specific use cases. Linguistic analysis takes a basic grammatical approach to understand how words combine into phrases, and how those phrases combine into sentences. While this approach works well with editorialized text (e.g., news articles and press releases), it does not perform as well on user-generated content, which is often filled with slang, misspellings and idioms. Statistical analysis, however, understands language from a mathematical standpoint and works well on “noisy” content (e.g., tweets, blog posts, and Facebook status updates). The combination of these two approaches allows for increased accuracy on a variety of content.
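A toy Python sketch of such a hybrid is shown below: a small lexicon-based (linguistic) score is blended with a simple statistical bag-of-words classifier. The lexicon, training examples, and blending weight are all invented for illustration and are not AlchemyAPI's actual method.

```python
# Toy hybrid sentiment scorer: a linguistic (lexicon/rule) score blended with
# a statistical (bag-of-words) model. All data and weights are illustrative.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB

LEXICON = {"great": 1.0, "love": 1.0, "terrible": -1.0, "broken": -1.0}

def linguistic_score(text: str) -> float:
    words = text.lower().split()
    hits = [LEXICON[w] for w in words if w in LEXICON]
    return sum(hits) / len(hits) if hits else 0.0        # range -1 .. +1

# Statistical side: train on a tiny, illustrative labeled corpus.
train_texts = ["love this phone", "great battery", "screen broken", "terrible support"]
train_labels = [1, 1, 0, 0]                              # 1 = positive, 0 = negative
vec = CountVectorizer().fit(train_texts)
clf = MultinomialNB().fit(vec.transform(train_texts), train_labels)

def hybrid_score(text: str, weight: float = 0.5) -> float:
    statistical = clf.predict_proba(vec.transform([text]))[0][1] * 2 - 1  # map to -1..+1
    return weight * linguistic_score(text) + (1 - weight) * statistical

print(hybrid_score("love this phone, gr8 battery"))  # both sides contribute despite the slang token
```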
Tags : 
industry, advantage, linguistic, grammatical, statistical analysis, content
    
IBM APAC
Published By: Digital Realty     Published Date: Feb 25, 2015
When measuring competitive differentiation in milliseconds, connectivity is a key component for any financial services company’s data center strategy. In planning the move of its primary data center, a large international futures and commodities trading company needed to find a provider that could deliver the high capacity connectivity it required.
Tags : 
financial services, trade processing, data center, connectivity
    
Digital Realty
Published By: Dell EMC     Published Date: Nov 02, 2015
Today’s IT environment is more complex than ever. The applications that have become critical to business operations require greater processing power, memory, and storage. This guide prepares decision-makers to choose servers that meet their current needs, while building a flexible, reliable, scalable infrastructure to handle future requirements.
Tags : 
    
Dell EMC
Published By: Dell EMC     Published Date: Nov 09, 2015
This business-oriented white paper summarizes the wide-ranging benefits of the Hadoop platform, highlights common data processing use cases and explores examples of specific use cases in vertical industries. The information presented here draws on the collective experiences of three leaders in the use of Hadoop technologies—Dell and its partners Cloudera and Intel.
Tags : 
    
Dell EMC
Published By: Dell EMC     Published Date: Oct 08, 2015
Big data can be observed, in a real sense, by computers processing it and often by humans reviewing visualizations created from it. In the past, humans had to reduce the data, often using techniques of statistical sampling, to be able to make sense of it. Now, new big data processing techniques will help us make sense of it without traditional reduction.
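The contrast between reduction-by-sampling and processing the full dataset can be shown in a few lines of Python; the data here is synthetic.

```python
# Illustrative only: estimating a statistic from a sample vs. computing it in full.
import random

random.seed(0)
full_data = [random.gauss(100, 15) for _ in range(1_000_000)]   # synthetic "big" data

sample = random.sample(full_data, k=1_000)           # traditional reduction: sample first
sample_mean = sum(sample) / len(sample)

full_mean = sum(full_data) / len(full_data)          # newer approach: process everything

print(f"sample estimate: {sample_mean:.2f}  full computation: {full_mean:.2f}")
```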
Tags : 
    
Dell EMC