The theft of network cards from a Verizon office last May caused many Manhattan businesses to lose Internet service for nearly a day. Outages such as this convey to all businesses—not just the affected ones—how dependent they have become on Internet access and how vulnerable they are to disruptions. A few hours of downtime—even outside normal business hours—could mean thousands of dollars in missed opportunities and lost customers. But how can you ensure you’ll have an unbroken connection?
A growing number of professionals, as well as companies of all sizes, are buying redundant connectivity: multiple paths to the Internet. If cable modem service, DSL, and/or wireless broadband are available in your area, you may want to subscribe to at least two of these services. (Few, if any, ISPs provide connections “for backup only.” While some will let you use a conventional modem if your connection fails, this is so slow that it is of limited use as a backup.)
But how do you configure your equipment to shift quickly and seamlessly to another connection? Most computers and SOHO routers cannot do this, so unless you have the right hardware or software, you have to reconfigure manually.
What’s more, if you’re paying for multiple connections, you should be able to use the extra bandwidth in nonemergency situations. You should be able to exploit all your links simultaneously (load sharing) to get more bandwidth than you could from any one of them. There should also be a way to shift loads when one of your connections is slowed through congestion. You may even want to steer users accessing your servers to the “closest” connection, Internet-wise, to them. But how do you do all this? You’re not likely to find any help in the documentation for your operating system or small router/firewall.
Unfortunately, it’s hard for a small company or an individual to exploit the protocols that implement load sharing and redundancy on the Internet. For example, the Border Gateway Protocol, or BGP (RFC 1771), lets a host or a network “advertise” to the Internet how it can be reached—and change that information if a connection goes down. But because BGP assumes you’re a large entity with a big network and your own block of Internet addresses, using it to provide redundancy isn’t practical even for moderate-sized companies, let alone SOHO users.
The Open Shortest Path First protocol, or OSPF (RFC 2328), is good at managing redundant connections inside a network that you control. But OSPF won’t help you manage multiple ISP connections unless they’re all to the same ISP, and that ISP has agreed to help you use OSPF for this. (Most ISPs won’t.)
If you’re a typical small-business user, you’ll need to find a way to get these features without the help of your ISP(s) or special protocols. Fortunately, engineers have begun to cook up hardware and software to let you do this.
The tricks used by these multihoming appliances can be divided into two categories: those for incoming connections (such as hits on your Web server) and those for outgoing traffic (your own browsing). The best-known are the ones that handle incoming traffic. Nearly all such products can create a server pool, which divides incoming requests among your servers and ensures that a client is matched to the same server throughout a Web transaction.
Most server appliances use domain name service (DNS) to route incoming traffic over a particular link. As most URLs include a domain name, it’s possible to control which link a client uses to reach your server by changing the IP address that’s returned when the client looks up a domain name.
These appliances have built-in DNS servers that respond to changing conditions. First, the appliance performs a periodic check on each Internet connection and stops furnishing IP addresses associated with a connection that goes down. (The standard protocols that domain name servers use already let them have multiple IP addresses, so a client automatically tries to reach the DNS server via a different link if one is down.)
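The failover behavior described above can be sketched as a periodic reachability check that prunes dead links from the pool of addresses the built-in DNS server hands out. This is a minimal illustration of the idea, not any vendor’s implementation; the link names and public addresses are hypothetical.

```python
import socket

# Hypothetical public addresses at which the same server is reachable,
# one per Internet link.
LINK_ADDRESSES = {
    "dsl":   "198.51.100.10",
    "cable": "203.0.113.10",
}

def link_is_up(address, port=80, timeout=2.0):
    """Probe a link by attempting a TCP connection to the server's
    address on that link. A refused or timed-out connection marks
    the link as down."""
    try:
        with socket.create_connection((address, port), timeout=timeout):
            return True
    except OSError:
        return False

def healthy_addresses():
    """Return only the addresses the DNS server should keep answering
    with; addresses on failed links are withheld until they recover."""
    return [addr for addr in LINK_ADDRESSES.values() if link_is_up(addr)]
```

Run on a short timer, a check like this lets the DNS server stop advertising a dead link within one polling interval, so new clients are steered to the surviving connection automatically.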
If multiple links are up, the DNS server may choose the IP address to return to a particular client by considering the loading and capacity of each of your links. It may also try to do a proximity test to choose the link that leads most directly to the client. Since the appliance can’t know how much data each client will request, it can’t always balance loads optimally, but it should at least let you take some advantage of all of your working links. Once the traffic gets inside your network, the device may also use a specialized form of Network Address Translation (NAT)—sometimes called smart NAT—to pair each client with a particular server, dividing the load so that none of your servers is saturated.
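The load-aware address selection described here can be sketched as a weighted random choice, where each link’s weight is its spare capacity. The capacities and utilization figures below are hypothetical, and real appliances may use more sophisticated metrics.

```python
import random

# Hypothetical links: the address handed out for each, capacity in
# Mbit/s, and current measured utilization (0.0 to 1.0).
links = [
    {"addr": "198.51.100.10", "capacity": 6.0,  "load": 0.50},
    {"addr": "203.0.113.10",  "capacity": 20.0, "load": 0.90},
]

def pick_address(links):
    """Weight each link by its spare capacity, so faster and
    less-congested links are returned to clients more often."""
    weights = [l["capacity"] * (1.0 - l["load"]) for l in links]
    return random.choices([l["addr"] for l in links], weights=weights, k=1)[0]
```

A fully saturated link gets a weight of zero and is skipped entirely, while two links with equal spare capacity split new clients roughly evenly.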
Even if your office doesn’t host anything that you think of as a server—for example, your Web site—server redundancy features are still important in the event of an outage. Why? Because many of the applications you may use every day—VoIP, peer-to-peer, remote-control programs (like LapLink), VPN, file-sharing programs, and more—cause your system to accept data connections from the outside world and thus act like a server. If machines in the outside world can’t “find” you, you may lose functionality, so any solution that provides good redundancy must divert all incoming traffic (usually via DNS) to the connection(s) that are still working.
Load balancing and redundancy for clients connecting outward to the Internet are relatively new areas, for which far fewer products are available, and techniques vary widely. Some split up outgoing traffic packet by packet, feeding each to the least congested upstream link. This technique is especially useful because many Internet connections are asymmetrical—with less upstream bandwidth than downstream. Unfortunately, allocating bandwidth packet by packet between multiple connections doesn’t always work. To help thwart Internet worms, some ISPs use egress filtering: They won’t let you send packets with a return address that belongs to a different ISP. So, a load balancer that wants to spread a connection among multiple physical links needs to be able to sense—or be told—that a particular link can be used to send only packets with certain source addresses.
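The source-address constraint imposed by egress filtering can be sketched as follows: mid-connection packets must leave via the link whose ISP accepts their source address, and only fresh traffic is free to take the least-congested path. The link names, addresses, and queue-depth metric are hypothetical.

```python
# Hypothetical mapping of outbound links to the source address each
# ISP will accept (egress filtering drops anything else), plus a
# rough congestion measure for each link.
LINKS = {
    "dsl":   {"source": "198.51.100.10", "queue_depth": 12},
    "cable": {"source": "203.0.113.10",  "queue_depth": 3},
}

def choose_link(packet_source=None):
    """Pick an outbound link for a packet. If the packet already
    carries a source address (mid-connection traffic), it must leave
    via the matching link; otherwise pick the least-congested one."""
    if packet_source is not None:
        for name, link in LINKS.items():
            if link["source"] == packet_source:
                return name
        raise ValueError("no link accepts source %s" % packet_source)
    return min(LINKS, key=lambda n: LINKS[n]["queue_depth"])
```

This is why a packet-by-packet balancer must be link-aware: once a connection is established with one source address, every later packet in that connection is pinned to the matching link, no matter how congested it becomes.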
Other appliances pick one link to use exclusively for each session, or connection, with a remote server. However, the router may not always correctly determine the best link to use. (Some products try to start every outgoing connection on several links simultaneously; the one that responds first is used, even if a random delay skewed the outcome.) Also, not all products can correctly balance loads on an asymmetrical Internet connection, such as ADSL. Finally, the appliance must be able to check the health of each link.
Source: PCMAG.com
IBM and Nuance, a speech-recognition products vendor, have announced a five-year research agreement to explore ways for the health care industry to tap into the capabilities of IBM’s Watson supercomputer, InformationWeek reports.
The two companies are collaborating with Columbia University Medical Center and the University of Maryland School of Medicine (McGee, InformationWeek, 2/17).
Watson Details
IBM researchers developed the Watson supercomputer over the past four years.
The machine — which had a successful run as a contestant on the television trivia show Jeopardy! — is powered by 90 servers and 360 computer chips (Tibken, Dow Jones/Wall Street Journal, 2/16).
Health Care Implications
IBM engineers, along with Columbia and University of Maryland researchers, are seeking to identify how Watson could work with health care professionals (Gaudin, Computerworld, 2/17).
Researchers will use Nuance technology to develop a “physician assistant” that could mine data from health care providers’ existing electronic health record systems, medical images and dictated reports to help guide physicians’ decisions. Data from other sources — such as best practices and other evidence-based literature — also could be factored in to the technology (InformationWeek, 2/17).
Eliot Siegel — a professor and vice chair of the University of Maryland’s department of diagnostic radiology — said, “Having a computer understand and present the information to me is a huge step towards allowing me to make a better diagnosis” (Castillo, Time, 2/17).
IBM and Nuance expect to bring the first Watson health-related products to market within 24 months (Denison, Boston Globe, 2/17).
Caution Urged
Some stakeholders are saying expectations about Watson’s applicability to health care should be tempered.
Peter Schulman — a cardiologist at the University of Connecticut Health Center — said technology vendors are quick to tout their products as ways to improve health care, but many do not fulfill expectations.
Joseph Dell’Orfano — a cardiologist at Connecticut-based St. Francis Hospital and Medical Center — said that computers are “just tools, ones that we need to use properly” (Weir, Hartford Courant, 2/15).
Katherine Frase — vice president of industry solutions at IBM Research — said that a computer would not be able to determine if a patient is lying, as a physician could. She added, “I don’t think that any machine is ever going to take the place of the decision making process of the human” (Time, 2/17).
Source: iHealthBeat
HHS has launched a Web portal that allows health IT developers to access health data in an effort to bring about more health care innovations, Modern Healthcare reports.
HHS Secretary Kathleen Sebelius said that the Health Indicators Warehouse portal “provides a new public resource needed to fuel development of innovative IT applications” to improve health care.
Portal Details
Currently, the warehouse contains nearly 1,200 health indicators derived from 170 data sources (Conn, Modern Healthcare, 2/13). Health indicators are defined as characteristics that describe the health of a population. Such characteristics include:
The Web portal is part of HHS’ open-government initiative, which is designed to make federal operations more transparent (Mosquera, Government Health IT, 2/14).
Source: iHealthBeat
An increasing number of medical professionals are turning to voice-recognition software, the Boston Globe reports.
Reason for Health Care Industry Adoption
Voice-recognition technology is popular among physicians because they are trained to dictate their notes. The software eliminates the need for transcribers by directly adding physicians’ notes to electronic health records.
The federal government’s push to promote EHR use has contributed to the increase in adoption of voice-recognition software.
In addition, voice-recognition technology has improved in functionality, speed and efficiency due to faster computer chip sets and software upgrades.
According to the Globe, future software upgrades could further expand the use of voice-recognition technology in the medical field (Denison, Boston Globe, 2/14).
Source: iHealthBeat
On Wednesday, HHS announced that it will make $750 million in federal funds available for disease prevention efforts, including data collection initiatives and IT infrastructure projects at local health departments, HealthLeaders Media reports (Clark, HealthLeaders Media, 2/10).
The money, which is in addition to $500 million that HHS allocated last year, will come from the Prevention and Public Health Fund created by the federal health reform law.
Specific projects include $137 million to bolster public health infrastructure by helping state and local health departments invest in new technology and staff training, and $133 million to collect and present data on the effects of the health reform law (Zigmond, Modern Healthcare, 2/9).
In addition:
HHS Secretary Kathleen Sebelius said, “This investment is going to build on the prevention work already under way to help make sure that we are working effectively across the federal government, as well as with private groups and state and local governments to help Americans live longer, healthier lives” (HealthLeaders Media, 2/10).
Source: iHealthBeat
Most parents prefer electronic forms over paper forms when filling out medical information about their children, according to a study published in the Journal of Medical Internet Research, the Canadian Press/Winnipeg Free Press reports (Canadian Press/Winnipeg Free Press, 2/7).
Study Details
For the study, researchers at Children’s Hospital Boston and Hospital for Sick Children in Toronto followed 180 parents of children who were being treated for attention deficit hyperactivity disorder in the Boston area (Abma, Postmedia News/Calgary Herald, 2/7).
Parents were asked to fill out traditional paper forms and electronic forms that asked about their child’s behavior, prescription medications and side effects to medication.
Although the two types of forms covered the same content, the electronic applications included:
Findings
According to the study, parents generally reported feeling less burdened when filling out the electronic forms, compared with the paper forms. Parents also said that the electronic forms were easier to complete and that they would be more likely to fill out such forms in the future (Postmedia News/Calgary Herald, 2/7).
Stephen Porter — lead author of the study and head of emergency medicine at the Toronto hospital — said the findings have important implications for efforts to design personal health records for pediatric chronic conditions (Canadian Press/Winnipeg Free Press, 2/7).
Source: iHealthBeat
For assistance or questions regarding online forms and how they would be used within your practice, don’t hesitate to contact us.
Virtual reality systems increasingly are being integrated into a wide variety of health-related settings, such as clinical IT systems, operating rooms, medical schools and treatment programs for returning soldiers, according to a new report from Kalorama Information, Healthcare IT News reports.
The report found that between 2006 and 2010, the U.S. market for health-related virtual reality applications experienced a compound annual growth rate of more than 10%, reaching about $670 million in sales in 2010.
Researchers predicted that the market growth will continue at a greater rate through 2015 as health care providers begin ratcheting up their equipment and technology spending.
In addition, the report noted that health-related virtual reality products are starting to gain new attention from health care industry suppliers, partly because recent merger and acquisition deals have opened up new opportunities in the health care virtual reality market (Monegain, Healthcare IT News, 2/7).
Source: iHealthBeat
The Markle survey looked to gauge the opinions of the public and physicians about the deployment of health IT. Nearly three-quarters of participating physicians said they would like to be able to exchange patient information electronically. Additionally, about 80% of patients and physicians said data-sharing requirements were important for coordinating care and reducing the likelihood of medical errors.
“By the same overwhelming margin, four in five doctors and patients expressed the importance of privacy protections for online medical records, an expectation we have repeatedly found on the part of the public in our previous surveys,” said Dr. Carol Diamond, Markle’s managing director, in the release. “This survey is a powerful indication that the public and physicians alike want investments in health IT to come with accountability.”
Source: ModernHealthcare.com
More than half of IT managers at health care delivery organizations do not protect patient data used for software development and testing, according to a survey released Tuesday, InformationWeek reports.
The report, conducted by the Ponemon Institute and sponsored by Informatica, is based on interviews with 450 IT professionals working at U.S. health care organizations.
Key Findings
According to the report, 74% of IT managers believe that meeting privacy and data protection requirements is important, but only 35% believe their company successfully achieves these goals.
The study also found that:
Researchers also asked IT managers about the consequences of data breaches and found that:
Recommendations
To improve data protection during software development, the Ponemon Institute recommends that health care organizations:
Source: iHealthBeat
Need assistance with protecting your patient data? Give us a call!
All businesses, whether small or large, operate on a day-to-day basis looking to cut costs, increase production and gain profit. One undeniable necessity for businesses of today remains computers and technology. Can you name a business that does not use computers as part of its day-to-day operations… besides the “Paleta guy” on the corner? No, you can’t!
Let’s focus on small businesses today: companies that have fewer than 50 employees and anywhere from 1 to 30 computers (including servers and workstations). Because of the nature of computers and their likelihood of failing, many business owners have developed the “break/fix” mentality. What is the “break/fix” mentality, you ask?
Simply put, many business managers and owners feel there is no need to worry about the maintenance of their computers or data until something “breaks” or fails to work properly. Unfortunately, a serious problem is brewing for business owners and managers who are not proactive about maintaining their technology.
While this may not apply to every business type, businesses that provide services to clients and retain sensitive information such as credit card numbers, Social Security numbers, mortgage records, and health information cannot afford to cut corners when it comes to their technology, network, and sensitive data.
Let’s visit a scary fact: According to the Institute for Business and Home Safety, an estimated 25% of businesses do NOT reopen following a major IT disaster.
We all understand that statistics can be skewed, but realistically speaking: would YOUR customers continue to trust your security if they received notification that their personal information, stored on your company’s computers, had been compromised or stolen, or even worse, used in identity theft?
No, they wouldn’t!
Let’s envision a different scenario, one that is actually common in the business world. The owner of an accounting company decides to streamline his budget and call his “neighbor’s son” to address his computer network when there is a problem. One day, a hard drive fails on the file server, and the company’s financial records stored on that drive are no longer accessible. “No problem,” says the business owner. “We have a backup of this drive, so we are fine.” Unfortunately, the backup was never tested, and the “neighbor’s son” soon realizes that the backup copy they have been using does NOT restore. The critical financial data is lost for good!
To make matters worse, an audit of the company’s books by the shareholders is scheduled for the next week. The business owner mentions that the financial records have been lost. Several of the shareholders allege that the owner has “cooked the books” and is simply hiding the data from them. The shareholders sue in court and are awarded a large sum of money. The business owner files for bankruptcy and goes out of business. Whoever thought backing up data was THAT important?
Managed Services is a proven solution to many of the technology problems that plague small business owners and can cause their businesses to close their doors unexpectedly. It is also known as proactive maintenance, maintenance plans, maintenance agreements, contract business tech support, or network tech support… the list goes on.
In a nutshell, this service is provided by IT service providers or computer/network consultants to keep networks, computers, servers and hardware working in optimal condition, and to prevent problems that “regular computer users” (most business owners) would fail to notice.
Just as an automobile requires maintenance to avoid catastrophic engine failure (repairs that, in many cases, cost more than the car is worth), so do networks and hardware. Some of the benefits that a maintenance plan can offer are:
* 24/7/365 monitoring of the health of servers, workstations, computers and hardware to discover and mitigate problems that could ruin a business.
* Managing (adding/removing) staff and employees on a network, preventing people without valid credentials from gaining access to confidential and privileged information.
* Ensuring that the anti-virus software installed on servers and workstations updates properly and contains the latest malware detection signatures.
* Monitoring Internet traffic to detect hackers attempting to gain unauthorized access to the network in hopes of using the company’s network to attack other businesses while concealing their true identity and location, as well as hackers looking to compromise a company’s network for illegal financial gain (credit card theft, etc.).
* Implementation of backup solutions: ensuring that all data is being backed up to prevent data loss, and testing the backups for integrity so they can actually be restored successfully if needed.
* Implementation of power supply solutions: ensuring that in case of a major power outage, all computers, servers, printers and other equipment stay powered on long enough for users to save their current work, with no loss in productivity or data.
* Network security implementation: ensuring that a policy for network and Internet access is put into place, that wireless networks are secured to prevent unauthorized users from “stealing” critical data or sharing the company’s Internet connection, and that users cannot jeopardize the integrity of the network (for example, by connecting a wireless router to the company’s wired network and exposing sensitive information to the outside world over an unsecured connection).
Source: ezinearticles
Copyright 2015 - Pulse Practice Solutions | 615.425.2719