Why Backup Data and How to Backup on Windows – A Simple Solution

WHY BACKUP DATA?

The most precious thing on your computer is the data. No, not the latest and greatest computer you purchased recently. The most valuable asset on your computer is all the data stored on it, especially what you have created, collected, processed, and organized for your work, learning, or even entertainment. Now here are some cold, hard truths about data and the associated risks.

1. More than 95% of computer users have experienced data loss at some point of their life.

2. Data loss cost US businesses $11.8 billion in 1998. 6% of all PCs will suffer an episode of data loss in any given year.
Reference: http://gbr.pepperdine.edu/2010/08/the-cost-of-lost-data/

3. An article from Boston Computing provides the following statistics:

      • 30% of all businesses that have a major fire go out of business within a year. 70% fail within five years.
      • 93% of companies that lost their data center for 10 days or more due to a disaster filed for bankruptcy within one year of the disaster.
      • 31% of PC users have lost all of their files due to events beyond their control.
      • Every week 140,000 hard drives crash in the United States.
      • A 2013 poll found that 30% of computer users had NEVER backed up their data.
      • According to a 2013 report, 55% of disaster-related downtime stems from hardware failure, 22% from human error, and 18% from software failure.
      • More than half (53%) of the SMB organizations surveyed revealed they do not conduct daily backups.
      • According to a survey of SMB organizations, 32% responded saying backup is not an efficient use of their time.
      • Simple drive recovery can cost between US$5,000 and US$10,000, and even then success is not guaranteed.

4. Refer to a USA Today 2006 News article.

5. Refer to an Information Week 2011 News article.

6. For many businesses, data is their lifeline; losing it can force them out of business.

7. Accidental deletion, virus attacks, hacking, and disk failure are some of the most common causes of data loss.

8. Fire, earthquake, storm, and flood are some of the natural calamities that result in data loss.

As the above shows, data loss has been a known and well-documented problem for a long time, at least a couple of decades. The volume of incidents has grown with the spread of computers among households and businesses, and the widespread use of mobile devices has raised the risk potential even higher. Add to that the recent rise in malware attacks and cyber-crimes, e.g., WannaCry, Petya, Bad Rabbit, and there seems to be no end in sight for security breaches, information leaks, and data loss / damage. All this is scary. So, how do you address such risks to your precious data? The best option is to practice “Safe Computing” (outlined below) and also take regular backups, a simple defense against data loss from any cause.

Here are a few basic points related to backup:
a) The main reason for backup is to recover a recent version of the data (one close to the latest) in case of a loss.
b) “Sync” vs. “Backup”: Sync is convenient and popular, but Backup is more robust, dependable, and legally compliant.
c) Network Attached Storage (NAS), especially with RAID, is a common and good backup medium.
d) Regular backup is a legal requirement for businesses in some countries.
e) Backup, preferably automated / scheduled, should be part of every business’s Disaster Recovery plan.

Backup is in general of two types — full and incremental.

  • Full backup saves all the files and folders selected for the backup. It generates a bigger archive and takes longer, but has the full data snapshot.
  • Incremental backup saves only the files that have changed or are new since the last backup (full or incremental). It is smaller and faster to complete, but it depends on the last full backup and any incremental backups in between.

It is a good practice to take a full backup after a significant number of incremental backups. For example, if incremental backups are taken every day, a full backup can be taken once a month.
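The difference between the two backup types can be sketched in a few lines of Python. This is only an illustration of the concept: the folder layout and the modification-time check are simplifying assumptions, not how any particular backup product works.

```python
import os
import shutil

def full_backup(src_dir, dest_dir):
    """Full backup: copy every file under src_dir into dest_dir."""
    shutil.copytree(src_dir, dest_dir, dirs_exist_ok=True)

def incremental_backup(src_dir, dest_dir, last_backup_time):
    """Incremental backup: copy only files modified after the last backup.

    Returns the list of relative paths that were copied."""
    copied = []
    for root, _dirs, files in os.walk(src_dir):
        for name in files:
            src_path = os.path.join(root, name)
            # Changed/new since the last backup? (mtime is a simplification;
            # real products track per-file state or checksums.)
            if os.path.getmtime(src_path) > last_backup_time:
                rel = os.path.relpath(src_path, src_dir)
                dest_path = os.path.join(dest_dir, rel)
                os.makedirs(os.path.dirname(dest_path) or dest_dir, exist_ok=True)
                shutil.copy2(src_path, dest_path)
                copied.append(rel)
    return copied
```

Restoring from such a scheme means applying the full backup first and then each incremental backup in order, which is exactly why the single-step restore discussed below is attractive.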

Earlier, restoration involved restoring the full backup, followed by the differential backup, and then all the incremental backups – a tedious and time-consuming process indeed! Now all of that can be achieved in a single step, such as “Structure Restore”.

Here are some trends / observations for the data management and backup area:

  • With the prevalence of offsite / cloud-based email services, on-site email servers are becoming less common and email backup is becoming less important.
  • Permission issues and PC unavailability are the biggest reasons for backup failure.
  • Automatic data backup is much better than manual backups as the latter takes extra time and can be forgotten or skipped under pressure.
  • A well-defined data retention policy is as important as the backup policy and arrangement.
  • In general a business should have a different backup and data retention policy for each of its departments.
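As a small illustration of a retention policy, the sketch below keeps every backup from a recent window plus the first backup of each month as a long-term snapshot. The function name and the policy parameters are illustrative assumptions, not a recommendation for any specific business.

```python
from datetime import date, timedelta

def backups_to_keep(backup_dates, today, keep_days=30):
    """Apply a simple retention policy to a list of backup dates.

    Keep every backup from the last `keep_days` days, plus the first
    backup of each month as a long-term monthly snapshot."""
    keep = set()
    cutoff = today - timedelta(days=keep_days)
    first_of_month = {}
    for d in sorted(backup_dates):
        if d >= cutoff:
            keep.add(d)                  # recent backup: always keep
        key = (d.year, d.month)
        if key not in first_of_month:
            first_of_month[key] = d      # remember monthly snapshot
    keep.update(first_of_month.values())
    return sorted(keep)
```

A department with stricter legal requirements might simply be given a larger `keep_days` value or additional yearly snapshots, which is one way different departments can get different retention policies.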

For the sake of your business, it’s a must to follow these Safe Computing Principles:

  1. Use legal and up-to-date Operating System and other software
  2. Use a reputed and current anti-virus / anti-malware software
  3. Install the latest released stable updates for all your software, especially the Operating System
  4. Use automatic / scheduled regular backups of all important data
  5. Be paranoid… OK, just be super careful — do not trust random email attachments, links in emails, unknown websites / tiny URLs etc.

Please refer to Ransomware (in Wikipedia), in particular the “Mitigation” section.

Our two other posts related to this may also be informative and helpful:

    1. How to Handle Ransomware Threat: Be Cautious and Backup Data
    2. How to Create a Folder Accessible to only a Specific User (Data Backup User)

HOW TO BACKUP DATA ON WINDOWS?

SARANGSoft offers two software products for data backup on Windows:

    1. filexpertez for backup of individual Windows PCs and servers
    2. WinBackup Business for backup of all Windows PCs and servers in a Windows network (domain or workgroup)

SARANGSoft filexpertez is one of the first applications to
a) provide “Structure (one-click) Restore”, reducing the restore effort by up to 99%.
b) allow versioning of the backed up files.
c) enable backup to cloud as well as various local storage.
With the use of “Structure Restore”, the need for differential backup is almost eliminated.

SARANGSoft WinBackup Business performs decentralized backup with centralized control — providing the best of both worlds.
a) The backup effort gets distributed across all the computers and is not imposed on a single computer. This results in more flexibility as well as faster backups.
b) The network remains clog-free as the files are not examined or transferred over the network.
c) Enables configuring multiple backups – use different backup specification for different computers or even for the same computer.
d) (Unlike many other backup products) Doesn’t force the users to put their files (to backup) in a single folder. Multiple folders on a user’s computer can be specified to backup.
e) No need to have a single folder-set apply to all computers. Specify a different folder-set to backup for every computer.
f) It’s a rare backup software that provides both Individual (flexible) and Common (easy to use) Selection of folders to backup — see more on that below.
g) Specify a different backup destination for each backup specification. It helps categorize the destination depending on backup contents or the department.
h) Get access to a number of important reports. Click on individual column headers within the reports to sort the data, toggle between ascending and descending order the same way.
i) To backup data locally (not accessible even to the administrator), set the backup destination to local storage media such as External USB Hard Disk / CD / DVD.

Individual Selection vs. Common Selection

Individual Selection
Specify a different folder-set for every computer to backup; not necessarily the same folder-set to backup for every computer.
Common Selection
(If Individual Selection is tedious) Specify a common folder-set for all the computers to backup.

NOTE: If there are permission issues in the network and you are unable to perform Individual Selection, you can still go for Common Selection of the folders to backup.
Individual Selection offers you flexibility, whereas Common Selection offers you ease of use. It’s you who decides what you want.

It is important to remember a few points about Backup Scheduling:

  • Use the same User Id to launch the “Admin Console” and configure backups as the one specified while installing WinBackup Business Server.
  • Scheduling may fail due to permission issues or unavailability of the remote agent computer. Use the ‘Failed Schedules’ report to retry scheduling at a more suitable time.
  • To re-schedule an existing backup, load it in ‘Manage Backup’, go to the schedule options page, modify the schedule, and click ‘Start’ on the Progress page.

‘CloudScape’, bundled with WinBackup Business, helps upload backup archives to public cloud storage (‘Amazon AWS-S3’ and ‘Microsoft Azure’) for added resiliency and protection.
NOTE: The AWS and Azure accounts must be your own. SARANGSoft does NOT provide such accounts along with its backup products.

If you feel lost at any point, do not find what you need, or face any other problem, use the context-sensitive help on any topic by just pressing the ‘F1’ key on the keyboard or clicking the ‘Help’ button on the toolbar.

Network Backup
There are two technology options for corporate level backup – push and pull.
a) Push mode backup: A backup agent runs on each individual PC to collect the files, package them, optionally compress and encrypt them, and then send them to the backup destination. It is more

  • Secure: data can be encrypted before sending over the network
  • Bandwidth Efficient: Data can be compressed before sending over the network
  • Faster: Backup effort is distributed across the workstations, thus achieving better overall throughput
  • Better Load Balanced: Backup effort is not put on a single machine and is distributed across multiple machines

b) Pull mode backup: The backup software pulls the data from individual PCs and then backs it up. All the work is done by a single “Backup PC”.
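The “collect, package, compress” part of a push-mode agent can be sketched with the Python standard library. Encryption is deliberately omitted here (it would require a third-party library such as `cryptography`); the SHA-256 checksum merely illustrates how the receiving side could verify integrity after the archive is transferred.

```python
import hashlib
import tarfile

def package_for_push(src_dir, archive_path):
    """Package a folder into a gzip-compressed tar archive, as a
    push-mode agent might do locally before sending it over the network.

    Returns the SHA-256 checksum of the archive so the receiving side
    can verify integrity after transfer."""
    with tarfile.open(archive_path, "w:gz") as tar:
        tar.add(src_dir, arcname=".")   # compress locally, off the network
    sha = hashlib.sha256()
    with open(archive_path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            sha.update(chunk)
    return sha.hexdigest()
```

Because the compression (and, in a real agent, encryption) happens on each workstation before anything crosses the wire, the network carries only the smaller archive, which is the bandwidth and load-balancing advantage listed above.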

WinBackup Business: Advanced Backup Options

  • Backup open files too (except on Windows XP, Windows Server 2003/2003-R2); no need to close files before backup.
  • Use Backup-on-the-Go for users who remain disconnected from the office network for a major part of the day (e.g., sales and field support team members).
  • Most backup applications fail to back up if the target computer is not in the network at the scheduled backup time. WinBackup Business, however, backs up at the scheduled time to a local disk and automatically transfers the backup archives to the backup destination when the computer is back in the network.
  • Configure powerful AES encryption for every backup, so that only the intended user can open the backup files.
  • Get email notification when the backup process completes on a computer.

WinBackup Business: Advanced Restore Options

  • Restore operation on a large archive can take quite some time. WinBackup Business can notify through email when restore is complete.
  • Restoration is decentralized, enabling the respective users to restore according to their own requirements without bothering the administrator.
  • Restoration is made easy using Structure Restore. Restore the latest version of all the files in one go.
  • To restore an older version of a file, use the Restore Point functionality.
  • To restore all files created or modified on/before a certain date, use the Advanced Filtering option.

Backup is essential for computing. Some of the backup tools are hard to use, or inflexible, or expensive, or all of those. Now you have options that work well and do not cost much.

Why use Optical Character Recognition in Various Scenarios

What is OCR?
Good quality Optical Character Recognition (OCR) is an unquestionably effective tool for extracting textual content in machine readable and editable format from a digitized image. The following steps are involved in the process – from scanning a paper document to getting the textual content out of the scanned image.

1. Use a compatible good quality document scanner to scan a paper document into suitable electronic format, such as JPEG, TIFF, PNG, PDF etc.

2. Preprocess the scanned document for skew correction, noise removal, and some mathematical transformations.

3. Perform Line Segmentation from the processed document, i.e., segregate each line from the scanned document.

4. Perform Word Segmentation, i.e., segregate the words of each line from the previous step.

5. Perform Character Separation, i.e., separate the characters of the segregated words from the previous step.

6. Perform Character Recognition, i.e., recognize each of the above segregated characters by looking at its pattern. Once the pattern is recognized, the corresponding textual character is obtained. The pattern recognition may involve different types of methods, such as statistical analysis, neural networks, structural matching etc.

7. From character to word, from word to line, and from line to the entire document, the text is recognized step by step and the textual version of the whole document is obtained.

In short, this is the main process and objective of Optical Character Recognition (OCR).
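To make the segmentation steps concrete, here is a toy Python sketch of line segmentation (step 3) using a horizontal projection profile: rows containing ink belong to a text line, and blank rows separate the lines. Real OCR engines use far more robust techniques; this only illustrates the idea.

```python
def segment_lines(binary_image):
    """Segment a binary image (list of rows of 0/1 pixels, 1 = ink)
    into horizontal text lines using a projection profile.

    Returns a list of (start_row, end_row) pairs, end exclusive."""
    ink_per_row = [sum(row) for row in binary_image]
    lines, start = [], None
    for y, ink in enumerate(ink_per_row):
        if ink > 0 and start is None:
            start = y                     # a text line begins here
        elif ink == 0 and start is not None:
            lines.append((start, y))      # a blank row ends the line
            start = None
    if start is not None:                 # line runs to the bottom edge
        lines.append((start, len(binary_image)))
    return lines
```

Word segmentation (step 4) works the same way, only on vertical columns within each detected line.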

There are different OCR products – some commercial and some open source, such as the widely used / known “Tesseract”, which is distributed as part of various types of product offerings from different companies. SARANGSoft offers a version of Tesseract as a free download from its website. It enhances the functionality of digipaper (https://sarangsoft.com/product/digipaper) by adding the OCR functionality.

Different Types of OCR
OCR is available for different languages, including English. A great deal of successful research has been done and more is still ongoing. Sometimes a document may contain multiple languages (such as forms). In that case OCR becomes more complex: an OCR engine for one language, such as English, is not sufficient, and the output textual content would also show the textual form of different languages / scripts. Hence, this is known as Multiscript OCR, as opposed to Single Script OCR in the simpler case of only one language / script.

It is also important to keep in mind that the textual content may be in printed form, or in handwritten form, or both. Very few OCR tools are effective in recognizing handwritten text, and most OCR tools focus on and handle only printed documents.

Tesseract supports recognition of text in multiple languages. However, our experience is more focused on the English language. From what we have noticed, Tesseract doesn’t handle a mixture of languages very well.

Accuracy
No OCR tool can guarantee 100% accurate output, for several reasons:

  1. A document might contain clean text, or it can be a bit fuzzy / unclear / noisy, especially if it’s an old document where the contents have faded. Unclear / noisy documents should be run through some preprocessing, like color and brightness adjustment, noise removal, filtering etc., to improve clarity. The cleaner the document, the more accurate the OCR result.
  2. Documents in grid format (i.e., in rows and columns) may also cause problems in accurately recognizing the textual contents. At times some of the (printed) text crosses the grid cell boundaries, making it more prone to error during recognition.
  3. Scanning of the paper document is an important factor in the recognition process. If the document is shrunk or skewed to a significant degree, preprocessing may not always bring the document back to its actual form. This also often leads to inaccurate OCR results.
  4. Recognizing handwriting is always hard, especially if it’s cursive, in which case special processing (including Neural Network techniques) may be needed to recognize the characters from the flowing nature of the writing. Bad handwriting is hard to recognize (both by humans and machines).
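The preprocessing mentioned above can be illustrated with a minimal Python sketch: a fixed-threshold binarization followed by removal of isolated ink pixels (scanner speckles). The threshold value and the 4-neighbor rule are arbitrary illustrative choices; production pipelines use adaptive methods such as Otsu’s thresholding and proper median filtering.

```python
def binarize(gray_image, threshold=128):
    """Turn a grayscale image (rows of 0-255 values) into a binary
    image: 1 = ink (dark pixel), 0 = background."""
    return [[1 if px < threshold else 0 for px in row] for row in gray_image]

def remove_salt_noise(binary_image):
    """Crude noise removal: drop ink pixels with no ink neighbor in the
    4-connected neighborhood (likely scanner speckles)."""
    h, w = len(binary_image), len(binary_image[0])
    cleaned = [row[:] for row in binary_image]
    for y in range(h):
        for x in range(w):
            if binary_image[y][x] == 1:
                neighbors = sum(
                    binary_image[ny][nx]
                    for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1))
                    if 0 <= ny < h and 0 <= nx < w
                )
                if neighbors == 0:
                    cleaned[y][x] = 0     # isolated speck: remove it
    return cleaned
```

The cleaner binary image is then what feeds the line / word / character segmentation steps described earlier.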

Use
In the business world, OCR can be used for long-term digital storage of important papers, such as invoices, financial statements, reports, legal documents, applications, permits etc., especially if the volume of such papers grows too much. In some organizations, dedicated staff members manually enter the details from those paper documents into digital formats (e.g., spreadsheets) to collect the data and make it available for processing. Another reason for digitizing paper documents is safekeeping and organizing for long-term storage. If the digitized documents are properly “tagged”, an added benefit is the ease and speed of retrieving them when needed.

However, manual entry of data and/or manual tagging is error-prone, time-consuming, and often inconsistent between individuals. Any error in manually entered data can cause major problems for the organization. OCR can be a great help in reducing the manual steps in the overall digitization process. The textual content of the scanned document images can be extracted using OCR and then put into appropriate places, such as database fields, rows and columns of spreadsheets, or “tags” for the concerned documents.

Please note the word “reduce” above – a lot of organizations think that by using OCR they can completely get rid of manual intervention, which is a bit too much to expect. As no OCR is 100% accurate, the output of OCR needs to be manually verified as appropriate for the case. That means instead of “entering” every data item manually, there is a need to “check” the correctness of OCR-generated data. It is not safe to blindly trust the OCR output in many cases.

In the banking sector, OCR may be used to recognize check numbers, account numbers, bank names, routing numbers etc. In the education sector, OCR may be able to help with form processing. In any domain, OCR can be used as part of the digitization process to extract textual content after scanning, and to use that content as (secondary) tags to later identify the documents.

In short, OCR reduces manual data entry, thereby saving time and reducing / avoiding errors to a good extent; at the same time, keep in mind that OCR output should be carefully checked before using it for any critical purpose.

Threat to an IT Network from End User Activities

Background
A major goal of managing IT networks is to guard against security breaches. Every hardware and software asset needs to be monitored on a regular basis, if possible continuously, so that appropriate preventive steps can be taken to keep the IT infrastructure running well and secured, because an organization’s success and reputation depend on its IT systems being protected. Though security threats may come from improper management of hardware and software assets, they are just as likely to be caused by end-user actions, such as unmonitored user activities.

What is meant by ‘Monitoring User Activity’
What does it mean to monitor “user activity”? It means checking for uncommon, unexpected, or suspicious actions by users, including the use of (specific) computers, network shares, applications, services, data etc. within the network. Being able to quickly identify any system misuse is an effective security mechanism, which might make it possible to stop an attack and clean up any fallout.

In an IT environment users take many actions as part of day-to-day activities, such as running various applications, collecting / creating / processing data, installing and uninstalling software, requesting hardware and software upgrades etc. As part of managing the network, IT Administrators deploy new versions / patches of Operating Systems and applications, and add and/or replace components / peripherals. The combination of existing software in the network and certain user actions might unknowingly open the door for security problems, such as attempts to hack the computers, copy / alter / delete data, download viruses / malware etc. Sometimes these problems are inadvertent, but deliberate actions to compromise network security are possible, and not uncommon. The effect of any such security breach can be devastating for an organization – ask the dozens of high-profile companies in the news for the wrong reasons over the past couple of years! A potential problem indication can be as simple as a particular user logging into (or trying to log into) a computer (server / desktop) unexpectedly or at an odd hour (beyond normal office hours), or a USB drive being plugged into a computer, and such. At times it could be a genuine requirement, in which case the red flag can be reviewed and discarded. In the other cases, that’s the main clue to track down and fix the problem. Being aware is essential to protect anything, or at least to assess and address any damage.

Challenges of monitoring user activity
Manual tracking of these events is hard to start with, and it gets increasingly complex and time-consuming. What is needed is an automated process that tracks users’ activities in as much detail as required. Every organization’s network has its own requirements, priorities, and challenges. Accordingly, the relevant events can be set up to be monitored, and alerts can be raised for review by the IT administrator.

But how to monitor effectively?
Finding the proverbial needle of a security threat in the haystack of activities is challenging. Automation is a viable way of identifying potential issues and narrowing down the list to actionable items. It’s not just the power of recording all possible actions and events in the network and analyzing those; the flexibility and ease of fitting the tool to an organization’s own requirements is just as important.

SARANGSoft SysExpertez is a Windows IT asset management application that does this monitoring efficiently and with ease. SysExpertez enables IT administrators to set up alerts on important user activities or even various status conditions (e.g., a disk drive’s free space falling below a level, System Thermal State, System Power Supply State), so that all these events are reported with details, which can be reviewed as reports as needed. A number of such reports are available in the ‘User Activity Reports’ section. Here are some of the reports generated on user activities in a network:
  • Currently Logged-in Users
  • Currently Logged-in Users by Computer
  • Users’ Login / Logout Times
  • Computer ON Status
  • User Logged into Different Computers
  • Users Logged into a Specific Computer
  • USB Device Plug-in
  • USB Device Plug-in by Computer

This variety of reports on end-user activities provides a good idea of what is happening in the network related to the end-users. You can get a report as a whole, as well as use ad-hoc queries regarding specific users or particular activities.
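One of the status alerts mentioned above, free disk space falling below a level, can be sketched in a few lines with Python’s standard library. The 10% default threshold is an illustrative assumption.

```python
import shutil

def low_disk_space(path, min_free_fraction=0.10):
    """Return True if free space on the volume holding `path` has
    fallen below the given fraction of total capacity."""
    usage = shutil.disk_usage(path)   # (total, used, free) in bytes
    return (usage.free / usage.total) < min_free_fraction
```

A monitoring agent would run checks like this on a schedule and raise an alert for the administrator whenever the function returns True.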

Detecting Prohibited Software

What is ‘Prohibited Software’?
Prohibited software refers to a software program or application that is deemed inappropriate and not allowed to be installed on any computer in a specific IT environment. The general reason is the security vulnerability / threat it can create for that computer or even for the entire IT environment. It is a kind of banning or blacklisting of a particular software for a specific IT environment. The reason for putting a software in such a category differs from organization to organization, depending on the domain of operation, data sensitivity, security concerns etc. Therefore, a software that is “prohibited” or blacklisted in one organization might be freely used in another, and vice versa. However, certain kinds of software are more likely to be marked as prohibited in workplaces.

Importance of detecting ‘Prohibited Software’
Computer users in an IT network often install different kinds of software – from within the organization (e.g., from a server share), from a friend / colleague, downloaded from the Internet, and such. Many of these software fail to meet security standards and pose a vulnerability / threat to the entire network. New software of various types and increasing complexity emerges on a regular basis. There is a lot of free software available on the Internet that is quite useful. For example, Adobe PDF Reader, Internet browsers (Firefox, Chrome, IE, Safari, Opera), Skype etc. are from highly reputed software publishers and widely used at home as well as in small to large organizations. There are also many popular games, media players, chat applications etc.

From our own experience, a widely used “free” media player app also tries to install a bunch of other software, and even if you opt out of all of them, it still silently slips a couple of questionable software into your computer! First of all, these are unknown software; on top of that, they sneak into your system. How comfortable / happy does that make you feel? In most cases, the adverse effect of such software on the IT system is unknown for a while. As a result, the network security threat increases significantly as more such software is installed. Software from commercial software providers is regularly reviewed and updated, but some commonly used software does not go through proper follow-up and is rarely updated. Any security flaw in such software remains and creates a backdoor for hackers and malicious programs to penetrate an organization’s IT network.
On the other hand, there is some popular software, including gaming, media, and social networking apps, the usage of which is likely to affect the focus and productivity of employees. The presence of such software in the workplace can also lead to various compliance issues. Also, if employees in an office download various software from the Internet and install it on work computers, it can lead to serious legal issues, such as license violations.
That’s why every organization needs to know what software is installed on its computers, and whether it is required and acceptable for business reasons. If not, such software should be identified as “prohibited” and arrangements made to stop it from being installed on any work computer. Controlling software installation is not a choice anymore; it’s a required step to address security, productivity, legal, and compliance issues.

Importance of ‘Software Asset Management’ in this regard
Detection of ‘Prohibited Software’ is a part of the bigger area ‘Software Asset Management’. Software asset management (SAM) is a business practice that involves managing and optimizing the purchase, deployment, maintenance, utilization, and disposal of software applications within an organization.
Proper software asset management is necessary for effective security practices to help combat cyber-attacks that can damage an organization in various ways. An effective SAM practice delivers intelligence on software across the network, providing clear visibility of the entire network inventory, which helps the Network Administrator make more informed software security decisions. SAM helps minimize the attack surface of an organization by detecting unauthorized and unsupported software and preventing it from being installed, or at least ensuring it is removed.

Methods of tracking ‘Prohibited Software’
Traditionally, the method known as ‘application blacklisting’ is used to track unwanted applications. It works by maintaining a list of applications that are to be denied system access, preventing them from being installed and executed. However, since the number, variety, and complexity of applications are increasing day by day, that approach is hard to follow these days.
The opposite approach to ‘blacklisting’ is ‘application whitelisting’. In this approach, a list of authorized applications is maintained. When a new application is about to be installed, it is automatically checked against the “authorized list”. If the application is not in the list, it is not permitted to be installed. This depends more on the honor system.

Are these methods foolproof?
Nowadays applications come in increasing numbers, with increasing levels of complexity and variety. So the ‘application blacklisting’ process is not likely to be foolproof. On the other hand, the ‘application whitelisting’ method also might not be practical, because the administrative resources required to create and maintain an effective whitelist often turn out to be inadequate.

Any way out of this problem?
Considering the possible threat to the IT network, it is not recommended to rely on manual processes to detect unwanted software. Rather, we have to rely on an automated system that can detect such applications without any manual intervention – a system that continuously monitors the IT network and immediately reports the presence of any unknown or unwanted software.
SARANGSoft SysExpertez provides this functionality along with full-fledged IT Asset Management (tracking of hardware, software, and users) in a Windows network. Let’s see how SysExpertez helps detect the unauthorized / unwanted software within a Windows network.

Role of SysExpertez in detecting ‘Prohibited Software’
SysExpertez categorizes installed software broadly into three distinct categories.
1. Licensed: legal copies of commercially published software from reputed providers, licenses for which are purchased, with budget allocated for renewals / upgrades; e.g., Microsoft Office, SQL Server, Adobe Photoshop, Oracle Database, AutoCAD etc.

2. Approved: There is a lot of excellent free software available. Depending on an organization’s needs and policies, its IT team can identify some of it as “Approved”; e.g., Adobe PDF Reader, Skype, the Firefox and Chrome browsers, some text editors (like Notepad++, TextPad) etc., which are suitable / beneficial for use in the workplace.

3. Prohibited: There is some software that an organization might choose not to allow in its network for various reasons – security threat, productivity loss, legal / compliance issues etc. These generally include games, media players, chat apps etc. Any installation of such software within the IT network should be detected ASAP and immediately acted upon (such as uninstalling it and preventing future occurrences).

SysExpertez helps put the known and relevant software into one of the above three (3) categories – Licensed, Approved, and Prohibited. If any software outside these three lists is installed on any computer within the network, SysExpertez can detect that, classify it as “Unknown”, and immediately notify the IT Administrator about it. The IT Administrator can investigate the case, and either

  1. Accept it as one of the first two categories (i.e., Licensed or Approved), or
  2. Put it in the Prohibited category and instruct the user(s) of the concerned PC(s) to immediately uninstall the software (and refrain from installing it in the future).
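The category lookup and “Unknown” detection described above can be sketched as follows. The category names come from this article, while the function and variable names are illustrative assumptions about how such a check might be coded, not SysExpertez’s actual implementation.

```python
def classify_software(name, licensed, approved, prohibited):
    """Classify an installed software title into one of the three
    known categories, or flag it as 'Unknown' for admin review."""
    if name in licensed:
        return "Licensed"
    if name in approved:
        return "Approved"
    if name in prohibited:
        return "Prohibited"
    return "Unknown"   # not in any list: notify the IT administrator

def audit_inventory(installed, licensed, approved, prohibited):
    """Return the titles that need immediate attention: anything
    classified as Prohibited or Unknown."""
    flagged = {}
    for name in installed:
        category = classify_software(name, licensed, approved, prohibited)
        if category in ("Prohibited", "Unknown"):
            flagged[name] = category
    return flagged
```

An automated agent would run such an audit against each computer’s installed-software list and raise the flagged titles to the administrator for the accept-or-prohibit decision described above.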

Monitoring of software assets helps keep the network safer and comply with legal and standards requirements.

How important is it to know the network inventory?

What is ‘Network Inventory’?
An IT network consists of various types of hardware (client PCs, servers, printers, and other peripherals) and software as well as the users. The hardware and software are commonly referred to as Network Assets, which constitute the entire network inventory.

At the simplest level, network inventory is a basic list of devices connected within the network. However, at a more advanced level, it can evolve to contain detailed information about software installed, hotfixes applied, services, and much more.

How important is it to know your Network Assets?
Managing the IT infrastructure of an organization is undoubtedly a challenging task. The assets in the network get deployed, updated, and removed fairly frequently, often without any set pattern, to support the operational needs of the organization and the overall computing environment (security issues, virus / hacker threats, product updates and enhancements etc.). Keeping track of the users and their access privileges is an integral part of IT management. One of the biggest challenges in managing the network is the lack of comprehensive knowledge and understanding of it, which is essential for decision-making and planning around the growth and improvement of the IT infrastructure.
If you are a network administrator, you have to face these common questions:
  • How many computers (client PCs and servers) are in the network (domain or workgroup)?
  • Which of these computers are active vs. inactive? Which have been added or modified?
  • What hardware components (CPU, RAM, motherboard, hard discs and partitions, network card / chip, video and audio card / chip etc.) are in those client PCs and servers?
  • What Operating System (Windows) version is running on each PC and server?
  • What Service Packs for the OS have been installed on each PC and server?
  • What software applications (including version, manufacturer etc.) are running on each PC and server?
  • What services are running on each PC and server?
And many more like these. Without these details you will never know the actual state of your network. Proper network asset management is impossible without the knowledge of the network assets.
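As a small illustration of how such details can be gathered programmatically, here is a minimal Python sketch that collects a basic OS / hardware record for one machine using only standard-library modules. The function name and the fields chosen are just examples for this article; a full asset-management tool such as SysExpertez collects far more detail (e.g., via WMI on Windows) across every machine in the network.

```python
import json
import platform
import socket

def collect_basic_inventory():
    """Gather a minimal OS / hardware record for the local machine."""
    return {
        "hostname": socket.gethostname(),
        "os": platform.system(),             # e.g. "Windows"
        "os_version": platform.version(),    # build / service-pack level
        "architecture": platform.machine(),  # e.g. "AMD64"
        "processor": platform.processor(),
    }

if __name__ == "__main__":
    print(json.dumps(collect_basic_inventory(), indent=2))
```

Running this on each PC and collecting the JSON records centrally would give a very rough, first-level inventory of the kind described above.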
What exactly is “IT Network Asset Management”?

IT Network Asset Management (also called IT Inventory Management) is an important part of an organization’s business strategy. It involves collecting detailed hardware and software inventory information, which is used to make decisions about purchases as well as redistribution of hardware and software over time.

IT asset management helps an IT organization manage its systems more effectively, saving time and money by avoiding unnecessary asset purchases and by redeploying or disposing of existing resources.

How do organizations manage their network assets? Is that sufficient?
It’s quite common for System Managers / Admins to manually monitor the entire network, at times with dedicated personnel. That’s a challenging task: repetitive, error-prone, time-consuming, and largely a waste of qualified systems professionals’ time. A tool that automatically monitors the network for such information and presents a consolidated view provides the latest status without tying up valuable human resources.

SysExpertez: A solution for network asset tracking, monitoring, and management
SARANGSoft SysExpertez is a comprehensive asset, domain, and operations management application with a number of exciting features that help manage IT assets, Active Directory domains, and operations in the network. It automates and simplifies the repetitive tasks and quickly provides accurate results through an easy-to-use interface.

It’s like a set of CCTVs within your network, so that you as the System Manager / Admin can get a full view through the “Admin Console”, as if sitting in a “Control Room”. The powerful Admin Console is super-easy to use with a simple menu-driven UI that also looks and feels great.

Preventing data loss on your computers

Introduction
Data has become an intrinsic part of modern human life. We are constantly searching for data, right from the time we wake up every morning. While some of it is live and online, a lot of data is collected, processed, organized, and stored for quick and easy access at any time. These data (stored in files and folders) are valuable for our personal needs: photos, videos, music, research outcomes, write-ups, important documents, and so on. If they are lost for any reason, it can significantly affect our lives professionally and/or personally (often emotionally). It is easy to understand, then, why we often fear losing such data to some unexpected problem.

Types of data loss and some precautionary steps
Though we often think about “data protection”, which includes guarding data against the prying eyes and hands of hackers and the like, “data backup” is intricately involved in the process. The term ‘data backup’ means copying data files to another medium (such as a disk or tape) as a precaution, in case the original storage medium (generally the hard disk built into the computer) fails. Data backup is crucial for businesses as well as individuals.

There are many ways your data can be lost. The common reasons are hardware failure, corrupted files, virus / malware, accidental deletion, and of course natural disasters (storm, earthquake, flood, etc.) or man-made disasters (vandalism, theft, terrorist attack, arson, etc.). Let’s look at a few safekeeping approaches to prevent data loss as part of a comprehensive data protection plan.

a) Create a standardized file / folder organization
It helps to develop a standard way of organizing and storing your files, so that you (and your users) will know where a particular kind of file is expected to be. Once this first step is done, backing up data files becomes more accurate and precise, and it saves time and hassle when restoring any lost data to its original location.

Organizing files and folders is the key to a data protection and restoration plan.
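To make this concrete, here is a small Python sketch that creates a standardized folder layout under a chosen root. The folder names are purely illustrative assumptions, not a recommended scheme; the point is that the layout is defined once and created the same way everywhere, which makes backup and restore locations predictable.

```python
from pathlib import Path

# Illustrative standard layout; the folder names are just examples.
STANDARD_FOLDERS = [
    "Documents/Work",
    "Documents/Personal",
    "Media/Photos",
    "Media/Videos",
    "Archive",
]

def create_standard_layout(root):
    """Create the standard folder tree under `root` (safe to re-run)."""
    root = Path(root)
    for rel in STANDARD_FOLDERS:
        (root / rel).mkdir(parents=True, exist_ok=True)
    # Return the created tree (relative paths) for verification.
    return sorted(p.relative_to(root).as_posix()
                  for p in root.rglob("*") if p.is_dir())

if __name__ == "__main__":
    import tempfile
    with tempfile.TemporaryDirectory() as tmp:
        print(create_standard_layout(tmp))
```

Because `mkdir(..., exist_ok=True)` is idempotent, the same script can be run on every machine (or re-run after changes) without harm.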

b) Identify which (kind of) files need to be preserved
Once you have organized your files and folders, determine which are important to you. Though you are the best judge of which files matter most, here are some ideas for your convenience.

The following types of files are important:
  • The files you can’t do without
  • The files you will need in the future
  • The files related to products & services you sell (for businesses)
  • Files that you cannot re-create
  • Files that you can re-create but don’t want to
  • Files you regularly use and/or refer to and/or update

On the other hand, the following types of files are less important:
  • Files you have not used (not viewed or edited) for a few years.

The following types of files might be good candidates to exclude from backup (or even delete from your computer to keep it clean):
  • Files you cannot remember why they are there.
  • Files you know are no longer useful to you or are known to be outdated.
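As one way to hunt for such candidates, here is a hedged Python sketch that lists files not modified in the last few years. Note the assumptions: modification time is only a rough proxy for “last used”, and the 3-year default threshold is just an example, not a recommendation.

```python
import time
from pathlib import Path

def find_stale_files(root, years=3):
    """Yield files under `root` whose modification time is older
    than `years` years.

    Modification time is only an approximation of 'last used';
    the 3-year default is an arbitrary example threshold.
    """
    cutoff = time.time() - years * 365 * 24 * 3600
    for path in Path(root).rglob("*"):
        if path.is_file() and path.stat().st_mtime < cutoff:
            yield path
```

You would still review the resulting list by hand before excluding anything from backup, let alone deleting it.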

c) Avoid storing documents on the same drive where Operating System is installed
On Windows, most document-editing applications save their files in the ‘My Documents’ folder, a location that is very well known. As a result, malware and viruses often target the files there, making them vulnerable.

Whether it is a virus or a software failure, the majority of computer problems affect the Operating System. Quite often the solution is to reinstall Windows, at times after reformatting that drive. In such an instance, you must make sure to copy / back up all of your own files (not the system or application files) from the drive, including the ‘My Documents’ folder; otherwise everything on the drive will be lost. You can create a separate drive (partition) on the same physical hard disk and store all your own files and folders on that second drive. If the OS drive needs to be reset, your data drive will remain unaffected.

It is also possible for the hard disk itself to go bad (a disk crash), in which case all drives on that disk will be lost. You can replace the hard disk and reinstall Windows and the applications to get it back to working condition, but in this case your files and folders on the data drive have also been lost. To handle such cases, you can use an external hard disk to store your data files, or simply back up your data drive to an external disk regularly.

d) Backup regularly
You can take a whole set of security measures to guard against data loss, but if your data is not backed up, it’s very likely that you WILL LOSE IT. So, ensure that your data is backed up regularly, and test the backup to ensure that your data can be recovered when you need it.

How often should you back up? That depends on how much data you can afford to lose if your system crashes completely. A week’s work? A day’s work? An hour’s work? Schedule your backups accordingly.
There are numerous backup programs with a variety of features. You can easily try out
  • SARANGSoft filexpertez (file-expert-ease) for backing up a Windows PC. It’s a comprehensive file and folder management tool for home, office, school / college, everywhere.
  • SARANGSoft WinBackup Business for backing up all PCs and servers in a Windows network (domain or workgroup) through a centrally managed arrangement.

Both products are feature-rich and flexible, yet easy to understand and use. They do not cost much, and there is a no-obligation 30-day free trial available.

e) Automate your backup procedures
All of us are busy. There are too many things to do every day, and too little time! Even if you are very sincere about regular data backup, it’s quite possible that you will forget to run a backup at times, which leads to an inconsistent backup arrangement. Ideally, backup should run in a consistent manner without any manual intervention. Depending on the importance of your data, you can schedule the backup operation to run automatically. The only thing you then need to do is check that the backups are really happening. It helps if the backup program can send you a notification when a backup finishes, either successfully or in failure (in which case you can look into the issue and fix it).

f) Encrypt your data while backing up
Encrypting your data during backup adds another layer of protection.
Encryption transforms the backed-up data so that it is unreadable by anyone except those who hold the password or “key”, which allows them to decrypt the data back to its original usable form.
There are various encryption mechanisms available, and many backup programs support them.
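To make the idea concrete, here is a deliberately toy Python illustration of symmetric encryption: it derives a keystream from a password with SHA-256 and XORs it with the data, so applying the same operation with the same password restores the original. This sketch only demonstrates the concept that one key both encrypts and decrypts; it is NOT a secure cipher, and real backup software should use a vetted algorithm such as AES.

```python
import hashlib

def keystream(password, length):
    """Derive `length` pseudo-random bytes from `password` (toy only)."""
    out = b""
    counter = 0
    while len(out) < length:
        block = password + counter.to_bytes(8, "big")
        out += hashlib.sha256(block).digest()
        counter += 1
    return out[:length]

def xor_crypt(data, password):
    """XOR `data` with the keystream; applying it twice restores the data.

    WARNING: illustration only; do not use for real encryption.
    """
    ks = keystream(password, len(data))
    return bytes(a ^ b for a, b in zip(data, ks))

secret = xor_crypt(b"backup payload", b"my password")
assert xor_crypt(secret, b"my password") == b"backup payload"
```

The key point for backups: without the password, the stored bytes are gibberish; with it, the original data comes back exactly.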

g) Create a local backup arrangement
All the important files should be backed up locally first. Make sure the backed-up files are available at your office / home. That ensures easy access and recovery, as well as control over the data.

h) Create an off-site backup arrangement
It’s a great idea to keep a copy of the backed-up files at a location different from your office / home. It provides “redundancy” as well as preparation for “disasters”.
If the local backup is damaged or lost for any reason, the off-site backup copy will save your day.

i) Use of “cloud” as remote storage for backed up data
Nowadays, it’s increasingly common to use the cloud as remote data storage. There are many benefits to cloud storage, the most notable being its durability and accessibility. Files stored in the cloud enjoy a very high level of reliability, and they can be accessed at any time, from any place with Internet access, using your own user credentials. As far as disaster recovery is concerned, data can be restored from the cloud without much hassle. Also, the cost of cloud data storage and restoration can be significantly lower than that of traditional storage.
SARANGSoft CloudScape is a unique cloud storage browser for the Windows platform that seamlessly integrates cloud storage (AWS S3 and Azure) with local storage (the PC’s hard drive). Its Windows Explorer-like user interface enables easy transfer (including drag & drop) of files and folders to and from the cloud, making cloud storage an extension of your local PC storage. It maintains the full folder hierarchy between a PC and cloud storage, which is not very common among such tools.

Ending Note
Making plans and implementing them takes time, effort, resources, and money. That’s why many of us defer doing it. However, the cost of not backing up data can be so severe that the upfront effort of setting up a backup process is worth everything you put into it.

VW emission scandal: How prevalent is QA/QC cheating?

The scandal around Volkswagen cheating the US EPA’s emission test process is known to almost everyone who follows current news. To recap the known facts: about 11 million vehicles (with 4-cylinder diesel engines) are affected around the world. Recalling and fixing those will cost VW about US$7.3 billion. The company might also face fines of as much as US$18 billion from the US government. Sales of all vehicles that might be affected, including VW and Audi models, have been stopped, at least in the USA (it is not clear what is happening elsewhere in the world).

Now there is news that other car companies might be using the same / similar tricks to fool the emission testing process.
http://www.thefiscaltimes.com/2015/09/22/Volkswagen-May-Not-Be-Only-Car-Company-Cheating-Emissions

Shocking, right?  Hardly.  Businesses use all kinds of tricks to get past regulatory / legal requirements, and some supposedly pro-business people — from that world as well as politics — consider that necessary for doing business (or succeeding in it).

I would like to share an experience (from around 2009) we had with a prominent manufacturing company based in Kolkata, India, where our offshore software development center is located. This company builds and supplies important parts for big-name heavy machinery makers around the world. Their production process seemed really old-fashioned: very manual, quite unmanaged, clearly inefficient. Still, they supposedly were profitable and busy filling large orders. In fact, the problem was that this company had a lot of orders to fulfill, but often their parts didn’t pass QA / QC due to defects in the production process, which led to the finished parts being rejected by their customers.

Our company was supposed to build a system to monitor their product testing process via automated data acquisition, check whether technical requirements were met, and record the results for later review. In fact, this system was supposed to be audit-ready, i.e., the company would show the recorded results to its buyers as proof of tested and passed products.

During one of the many discussions, one of their managers asked us to keep a backdoor to access and modify the test data, if needed. The idea, he explained, was this: say the plating of a part was supposed to be 70-72 microns thick, but sometimes it came out at 68 microns (or maybe 75 microns) due to some defect in the production process. The automated system would measure the actual thickness and store it in the database, from which the reports would be generated for review and audit. This company wanted a backdoor to the database backend, so that they could manipulate the stored thickness data whenever it fell outside the permissible range but came close, i.e., in the 67-69 or 73-75 micron range. That means the Quality Control process for those parts was being compromised. Is that a lot? No. Does it affect the final product quality, performance, reliability, etc.? I don’t know. Is it dishonest? Absolutely, no doubt. I am glad, and relieved, that we didn’t end up doing that project, for various reasons.

This surely was not the only occasion, or area, where this company cut corners, and many more like it do the same, all across the world, on a regular basis. Sometimes they get caught and/or it blows up in their face, there is a lot of hoopla about it, and then everyone goes back to normal life, without knowing how much more such corner-cutting is going on all over the place. It happens in construction, in the production of household goods and electronics, in medicine (drugs and devices), in automobiles (as in the case of VW), and in almost all industries.

What can we do about it?  I am not sure.  These cheating systems (software — yes, that’s why I picked up this topic) have been developed by some software (and hardware?) engineers, who probably knew what was going on, but did it as part of their job.  If we individually do not stand up for what is right, we will collectively keep on suffering the results of all of our misdeeds.  Could we have stood up and said No to that customer’s demand for the “backdoor”?  As a small software company, under serious pressure at that time, the potential new customer was quite important for us.  To be very honest, I am just glad that (potential) customer ultimately decided not to work with us on that project.

I am sure a lot of people have encountered deliberate and willful compromises in the quality, reliability, and performance of products they use and depend on every day. So, VW’s case is not shocking (in a way), because we already knew something like this could be happening with some product or another.

Importance of Electronic Document Management System

Importance of Document Management
A document, whether personal, professional, official, or academic, has always been important to us. Documents are an integral part of our life; we depend on them to carry on our daily activities at home, in institutions, and in offices. However, keeping track of and maintaining all these important documents is not an easy task. Who hasn’t been through the search for an important document, the common hunt for the latest information, and the problems faced when it cannot be found? Document management has thus become a high priority in our daily life. A good document management system helps organize documents in an efficient manner, so that any document can be found whenever needed.

Why did the Electronic Document Management System come into being?
Document management started as a manual process (when documents were only paper-based), which included storing documents – sometimes with paper tags / markers of different colors – in folders and boxes, and later in filing cabinets / racks. The color-coded tags helped differentiate and identify categories of papers / files while searching for specific documents. These manual processes often turned out to be inefficient, error-prone, and time-consuming, but that was the prevalent approach in the absence of any other option. With the advent of automation and computing in our daily life, we naturally turned towards computer-based document management systems to take advantage of the computer’s speed, capacity, efficiency, and reliability. A number of different document management systems came to market, some of which convert paper documents into digital form and then manage them. Thus Electronic Document Management started taking off.
digipaper is an innovative and versatile software product that manages the growing accumulation of both paper and digital documents, creating archives or repositories for convenient and efficient operation at any organization.

A common shortfall in digitized document management
Digitization of documents generally focuses on converting paper documents into scanned images and storing them. Most of us understand the importance and benefit of digitizing paper documents, but that alone does not fulfill the requirement of managing all digital documents within a ‘repository’. Apart from paper documents, in every office endless digital documents of various types (documents, spreadsheets, presentations, images / photos / logos, designs, forms, templates, etc.) are created every day on computers. All these files have their own value in terms of the organization’s operations, rules / policies, and future plans. These files are edited / changed as needed, thereby creating multiple versions, and are distributed among different employees or even external parties (such as partners, customers, suppliers / vendors, etc.). Often a particular file is changed a number of times, and all the versions have to be tracked so that a specific version can be found when needed. That’s why proper design and organization of archives / repositories, along with versioning, is essential for a good document management system.
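The versioning idea described above can be modeled in a few lines of Python: every time a document is checked in, a new version entry is recorded along with a content hash, so any earlier version can be retrieved later. This is only a schematic sketch of the concept (the class and method names are invented for this example); it is not how digipaper or any particular product is implemented.

```python
import hashlib

class DocumentStore:
    """Minimal in-memory version store for documents (illustrative)."""

    def __init__(self):
        self.versions = {}  # name -> list of (sha256 hash, content)

    def check_in(self, name, content):
        """Record a new version of `name`; returns its 1-based number."""
        digest = hashlib.sha256(content).hexdigest()
        self.versions.setdefault(name, []).append((digest, content))
        return len(self.versions[name])

    def get(self, name, version=None):
        """Fetch a specific version of `name` (default: the latest)."""
        history = self.versions[name]
        index = (version - 1) if version else -1
        return history[index][1]

store = DocumentStore()
store.check_in("policy.docx", b"draft 1")
store.check_in("policy.docx", b"draft 2, revised")
assert store.get("policy.docx") == b"draft 2, revised"
assert store.get("policy.docx", version=1) == b"draft 1"
```

The stored hashes also make it easy to detect whether a newly checked-in file actually differs from the previous version.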

digipaper as an Electronic Document Management System
Along with converting paper documents to digitized format and tagging and organizing them in a hierarchical structure, digipaper also helps an organization create a “repository” of all types of important electronic documents, with all their versions, in a central location. Any version of a document (and its full history) can then be found in one place within a very short time. This helps avoid the problems of using outdated or wrong documents / data. It also minimizes the time wasted and frustration of searching through endless unorganized documents, and helps control access for different users. Once a proper document management system is deployed in an organization, improvements in operational efficiency, quality of work, and employee and customer satisfaction become visible almost immediately. The benefits grow even more as time passes and more documents are brought under the system.