
Career in Data Analytics: 4 non-technical aspects people don’t think about

This article has some important tips for you if you want to build a career in data analytics.

Certain universal truths cannot be denied. The sun will always rise in the east and set in the west. The earth will continue to revolve around the sun for the next 5 billion years. Free from any impurities, water is always tasteless, colourless, and odourless. Here’s another fact that has recently made its way into this club of truisms: the world is, and will increasingly be, driven by data.

You probably saw that one coming, didn’t you? I’ll posit that the only reason we don’t pay closer attention to data is the sheer quantity of it surrounding us today. In 2018, IBM estimated that more than 2.5 quintillion bytes of data were generated across the globe every day.

Why we generate such a huge amount of data

Because every action we take, every action we don’t take, and every bit of content we consume, share, or reject adds to the massive, growing pile of information created every second of every day. Data is like air: essential yet invisible, and we can’t seem to stop generating it.

And, with the growing integration of digital technologies and IoT devices in our lives, the speed of data creation is only set to accelerate in the future.

Naturally, with such a data inundation comes transformation. Enterprises are already tapping into the benefits that a data-centric approach can deliver for their business and are eagerly building their data capabilities to capture the opportunity that looms just over the horizon.

Marketing, strategic decision-making, sales, human resources, supply chain management: each and every business function is actively being touched and redefined by data.

Data is changing the job landscape

It is also changing the jobs landscape. We now have specialised roles such as data specialists, data architects, and, for large enterprises, even Chief Data Officers (CDOs). For professionals, the field of data analytics and business intelligence is interesting and lucrative; data-related roles such as data scientists have been consistently highlighted amongst the most promising jobs of the future and currently offer the highest median salaries across the globe.

So, what do you need to benefit from the opportunity that this high-growth sector represents? There are a thousand and one informative blogs about the technical skills you can acquire to become a data professional. That is why I will focus on the softer aspects, learnt from my personal experience in this domain, that will help you craft a career in the field of data analytics:

Tips to build a career in data analytics

One of the biggest lessons of my career is the need to question everything. It is part and parcel of the job to take a process that is working perfectly fine and find a way to make it better. This is why a data professional can never be satisfied with the status quo, nor should they be. Inculcate a habit of evaluating everything you observe and use data to back up your observations. Analysis always starts with you.

The ability to constantly learn and augment one’s knowledge base is a skill that is of the utmost importance to a data professional. The field itself is dynamic, fast-evolving, ever-changing. To succeed, you will need to stay in step with or maybe even ahead of its evolutionary curve. And the only way to do that is to constantly learn, unlearn, and relearn.

As a data professional, you must remember that there is no fixed path to finding solutions to problems. ‘1+4’ is just as much a ‘5’ as ‘2+3’ is, which is just as much a ‘5’ as ‘1+1+2+1’. Even if the end objective achieved is the same, sometimes one solution works better than another. This is why you need a flexible, agile mindset that can break a large problem statement into multiple smaller parts and figure out the solution that delivers the best results. It is also essential to be ready to abandon an idea that initially worked but no longer does, or to try newer ideas that might work better. The question isn’t ‘if’; it is ‘how’.

You might be a wizard when it comes to analysing data, but it will amount to little if that analysis cannot be translated into simple, actionable insights that can be used by the end-user, be it a client or an internal stakeholder. Look at how you can simplify and personalise your analytics reports to suit the needs of your end-user. Do it constantly and it will become second nature. The objective is to help your end-user make the best decisions, backed by data-driven insights that they can readily understand and implement.

Let’s end as we began: with another universal truth. Change (or, better yet, entropy) will always be the only constant. As a data professional, your entire job revolves around how best to navigate that change, make life simpler for everyone who depends on you, and add value to processes and functions.

Cultivating these soft skills can help you achieve these objectives and complement your technical skillsets to create a lucrative career trajectory in this specialised, high-growth field.

Source: Tom Ricks

Data V Tech is proud to be one of the leading ERP vendors in the Asia Pacific. We have implemented Epicor ERP for many enterprises and organizations in Vietnam and China. For direct consultation, please feel free to contact us.

Pharma Pharmaceuticals deploys Epicor ERP


Epicor Software Corporation, a global provider of industry-specific enterprise software, has announced that Pharma Pharmaceutical Industries (Pharma)—a leader in the Kingdom’s pharmaceutical services industry—has implemented Epicor ERP to enhance data capture and archiving capabilities, optimise operational efficiencies, and guarantee on-time delivery to customers.
Pharma brings together business leaders and healthcare veterans to offer services to the Saudi pharmaceutical industry that include branding, marketing and sales, warehousing and logistics, facilities management, and regulatory compliance consultancy.
The company’s international reach demanded a new approach to technology to ensure it remained a leader in the new global digital economy. It embarked upon a bold digital transformation journey, with a robust ERP platform as the planned keystone.
“We didn’t have an ERP platform in place and relied on a paper system to manage each department’s activity,” said Tariq Kayyali, quality unit director and ERP project manager, Pharma Pharmaceuticals Industries, whose team of stakeholders focused on linking and integrating department transactions to reduce the time taken to locate vital archived data.
Pre-digitization, errors in starting materials and the resulting mix-ups and delays in order deliveries became significant business risks to the company. The risk of using incorrect or expired material in the manufacturing process was compounded by the company’s inability to accurately control all its assets and by its tendency to produce inaccurate reports through manual compilation.
With the support of trusted Epicor partner, Full Insight Technology Solutions (FITS), Pharma deployed a platform that was easy to install, user-intuitive, and provided tight-fit functionality with its needs. Over a period of 10 months, the system was rolled out to 15 users, who all reported ease of use and unprecedented accuracy.
Switching to Epicor ERP has allowed Pharma to smoothly link and integrate transactions across all departments and enhance accuracy in operations and inventory control. Strict audit trails now allow every critical transaction to be traced—to the user, date, and time of action—allowing Pharma to closely monitor related business impacts. Labelling problems have also been overcome by enabling greater control over purchased and manufactured parts.
“We ended up saving about 30 percent of unnecessary warehouse-team transactions and managed to reduce the time taken to find or track historical data records or transactions from hours or days to seconds or minutes,” said Kayyali. “We also reduced the risk of mix-ups and eliminated the possibility of using an invalid or expired part as a starting material—which is critical in the pharma industry.”
“Having a validated ERP system helps us meet the expectations and compliance requirements of medicinal product agencies, both in the Kingdom of Saudi Arabia and around the world.”
“Remaining relevant in the digital economy is no minor feat,” added Amel Gardner, regional vice president, Middle East, Africa and India (MEAI), Epicor.
“Competitors—especially new market entrants—will not be using manual processes for critical functions. It is therefore vital that all firms digitise as much as possible. Epicor ERP will give Pharma a strong platform for growth and enable it to easily comply with granular industry requirements. The platform is designed to fit every business like a glove, delivering automation and operational enhancements that pave the way to true digital transformation.” — Tradearabia News Service


Data V Tech is proud to be one of the leading ERP vendors in the Asia Pacific. We have implemented Epicor ERP for many enterprises and organizations in Vietnam and China, particularly in the pharmaceutical industry. For direct consultation, please feel free to contact us.

Learn about Epicor ERP in the wake of COVID-19 and get ready for the future.

Bridging the gap in data analytics by machine learning and AI


Machine Learning and AI are two exciting application areas that enable programs to automatically learn and improve from experience

Location data has become a default for businesses as well as private consumers. But although consumers are experiencing exponential digital developments through major technology giants, digital transformation for governments and industries is moving far more slowly.

A major survey by McKinsey shows that less than 1% of all data collected is analyzed. Data grows quickly, and all data has a location factor, yet only a fraction of it is analyzed for smart decision-making. We translate data from the dynamically changing environment to make data-driven decisions: real-time data is converted into usable information, while self-learning algorithms embedded in our solutions continually improve predictions.

Machine Learning and AI are two exciting application areas that enable programs to automatically learn and improve from experience. Our customers have greatly benefited from Machine Learning to create sustainable and light solutions.

Towards Sustainability 

In the Netherlands, the government has an initiative called ‘common ground’ that strives to create a future-proof municipal IT infrastructure. IMAGEM contributes to this initiative for all government customers through the VALLEY – a concept of reusing applications and pay per use models. Our model supports the development of society in collaboration with government, industry and citizens.

Source: Wouter Brokx

Data V Tech is proud to be one of the leading ERP vendors in the Asia Pacific. We have implemented Epicor ERP for many enterprises and organizations in Vietnam and China. For direct consultation, please feel free to contact us.


Understanding the COVID-19 Pandemic as a Big Data Analytics Issue

Big data analytics techniques are well-suited for tracking and controlling the spread of COVID-19 around the world.

The rapid, global spread of COVID-19 has brought advanced big data analytics tools front and center, with entities from all sectors of the healthcare industry seeking to monitor and reduce the impact of this virus.

Researchers and developers are increasingly using artificial intelligence, machine learning, and natural language processing to track and contain coronavirus, as well as gain a more comprehensive understanding of the disease.

In the months since COVID-19 hit the US, researchers have been hard at work trying to uncover the nature of the virus – why it affects some more than others, what measures can help reduce the spread, and where the disease will likely go next.

At the core of these efforts is something with which the healthcare industry is very familiar: Data.

James Hendler, RPI. Source: Xtelligent Healthcare Media

“This is, in essence, a big data problem. We’re trying to track the spread of disease around the world,” James Hendler, the Tetherless World Professor of Computer, Web, and Cognitive Science at Rensselaer Polytechnic Institute (RPI) and director of the Rensselaer Institute for Data Exploration and Applications (IDEA), told HealthITAnalytics.

At RPI, researchers are using big data and analytics to better comprehend coronavirus from a number of different angles. The institute recently announced that it would offer government entities, research organizations, and industry access to innovative AI tools, as well as experts in data and public health to help combat COVID-19.

“We’re working with several organizations on modeling and dealing with the virus directly using a supercomputer, and we’ve been creating some websites where we track all the open data and documents we can find to help our researchers find what they’re looking for,” Hendler said.

“We also have some work we’ve been doing in understanding social media responses to the pandemic. One project, in particular, has focused on tracking data from Chinese social media as coronavirus spread there in mid-January, and then comparing it to American data.”

Between recognizing signs and symptoms, tracking the virus, and monitoring the availability of hospital resources, researchers are dealing with enormous amounts of information – too much for humans to comprehend and analyze on their own. It’s a situation that is seemingly tailor-made for advanced analytics technologies, Hendler noted.

“There are several big data components to this pandemic where artificial intelligence can play a big role,” he said.

“One component is biomedical research. A lot of work is going on to try to develop a vaccine and to find out whether any current drugs work against COVID-19. All of those projects require molecular modeling, and many of them are using AI and machine learning to map things we know about the virus to things in pharmacological databases and genomic databases.”

Several big-name organizations have launched projects like these – Amazon Web Services, Google Cloud, and others have recently offered researchers free access to open datasets and analytics tools to help them develop COVID-19 solutions faster.

“AI can eliminate many false tracks and allow us to identify potential targets. So instead of trying 100 or 1000 different things, we can narrow it down to a much smaller size much faster. That’s going to accelerate the eventual finding of the vaccine,” Hendler said.

Researchers are also leveraging AI to evaluate the effects of COVID-19 interventions on individuals across the country, Hendler stated.

“A second component has to do with natural language processing and social media. What can we extract from social media that can help our scientists? What can we learn about how people are bearing the burdens and stresses of the pandemic?” he said.

“With SARS and other outbreaks, we never really had to figure out how different social distancing techniques are impacting the spread in different places. You can’t just compare numbers, because there are a lot of other factors to consider. AI is very good at that kind of multi-factor learning and a lot of people are trying to apply those techniques now.”

At UTHealth, a team developed an AI tool that showed the need for stricter, immediate interventions in the Greater Houston area. And at Stanford University, researchers have launched a data-driven model that predicts possible outcomes of various intervention strategies.

Using big data and analytics tools of their own, Hendler and his team are aiming to do something similar.

“We have a lot of time-series data from China, we have information about airline transportation, and we have population models for each country. Now we’re looking at doing this in our own region, and seeing if we can track and predict the spread based on the kind of social measures taken within different regions,” he said.

“We want to prototype that in our region and then scale it up to the US, and then eventually, the world.”

AI can also help organizations draw on research from the past, applying this knowledge to present and future situations.

“A third area where AI can make an impact is in mining scientific literature,” Hendler said.

“In past years, you had hundreds of grad students reading papers and trying to figure out what was going on. At many universities, there’s a lot of effort to say, ‘What can we learn from what’s already been published?’”

While AI and other analytics technologies appear to be the best possible tools for assessing and mitigating a global pandemic, researchers can’t always access what they need to build these models.

“The ideal data is hospital data that would tell us who is experiencing certain impacts from the virus,” Hendler said.

“For example, one project we’d love to do would be to correlate environmental or genomic factors to the people who are getting advanced respiratory problems, which is what’s killing most people with this disease. Is there a genetic component to that? Is it something where environmental factors are some kind of comorbidity? But we can’t always get that kind of data because of HIPAA restrictions.”

Instead, research teams should focus on extracting insights from the information they do have available, Hendler said.

“Information about how people are moving, the effect of travel restrictions or stay-at-home orders, how many people have what – that’s data we can get. The more details we can get, the better, and a lot of that data is starting to be shared because you don’t have to say who the people are, just where the people are,” he said.

The unprecedented impact of coronavirus around the world has sparked the need for unprecedented partnerships, and these collaborations will contribute significantly to finding viable solutions.

“Healthcare, academia, and industry are mostly set up for people to stay in their own lanes. But people are rapidly beginning to realize that attacking this problem is going to require a collaborative effort,” Hendler concluded.

“To make any real progress in this situation, you need to bring together people who understand the computation and AI, people who understand the biological and biomedical implications, and people who understand population models. It’s a very interdisciplinary problem, and to make any headway, we need the data and we need the team.”

Source: Jessica Kent

With the assistance of ERP software, many manufacturers in Vietnam and China have remained active during this pandemic without accelerating its spread. The solution has proved its high value during this crisis, as it allows workers to work from home while connectivity and internal and external communication continue seamlessly.

Data V Tech is proud to be one of the leading ERP vendors in the Asia Pacific. We have implemented Epicor ERP for many businesses in Asia Pacific. For direct consultation, please feel free to contact us.


Predictive Analytics: What Is It and How Can It Help the Federal Government?

Powerful data analytics tools can help agencies save money and make more informed decisions.

Predictive analytics tools allow the government to get ahead of problems before they waste money, harm IT systems or cost lives. Such data analytics platforms can provide agency leaders, IT leaders, and analysts with actionable insights they can use to enhance their missions, improve their cybersecurity, save money on maintenance costs and generally make more informed decisions.

Agencies can also take advantage of open data to glean insights for and from one another, or open up data to the public and give them the opportunity to do the same.

“From spotting fraud to combatting the opioid epidemic, an ounce of prevention really is worth a pound of cure — especially in government,” Deloitte notes in a report on predictive analytics in government. “Predictive analytics is now being applied in a wide range of areas including defense, security, health care, and human services, among others.”

What Is Predictive Analytics?

For years, federal agencies employed traditional statistical analytics software (SAS) to build predictive models, but those workers were usually sequestered into back rooms without access to policymakers, notes Andrew Churchill, vice president of federal sales at analytics firm Qlik. “But now data science is in vogue and it’s the cool job,” he says.

The most basic way to understand predictive analytics is to ask, “How do I take what I can clearly see is happening and begin to, through trained models, describe what will happen based on the variables that we are feeding the machine?” Churchill says.

Mohan Rajagopalan, senior director of product management at Splunk, notes that predictive analytics involves the ability to aggregate data from a variety of sources and then predict future trends, behaviors and events based on that data. That can include identifying anomalies in data logs and predicting failures in data centers or machines on the agency’s network. It can also be used to forecast revenues, understand buying behaviors and predict demand for certain services.

“The outcome of predictive analytics is the prediction of future behaviors,” Rajagopalan says.

Adilson Jardim, area vice president for public sector sales engineering at Splunk, says that predictive analytics exists on a spectrum. On one end are basic statistical or mathematical models that can be used to predict trends, such as the average of a certain type of behavior. On the other end are more advanced forms of predictive analytics that involve the use of machine learning, in which data models are asked to infer different predictive capabilities, Jardim says.

Some customers are ingesting up to 5 petabytes of data per day, and that data can be used to not only understand what has happened but what could or is likely to happen, he says.

Predictive analytics can be applied across “a broad range of data domains,” Churchill says.

Defining the Predictive Analytics Process

There are numerous elements of the predictive analytics process, as Predictive Analytics Today notes. Here is a quick breakdown:

* Define project: Agencies first must define the scope of the analysis and what they hope to get out of it.

* Data collection: Getting the data itself and mining it can be a challenge, according to Rajagopalan. One of the big challenges federal agencies and other organizations face these days is the volume, variety, and velocity of data. “A model in the absence of trustworthy, validated and available data doesn’t yield much of a result,” Churchill adds.

* Data analysis: Another core element of the process involves algorithms that can inspect, clean, transform and analyze data to derive insights and make conclusions.

* Statistics: Predictive analytics tools need to then use statistical analysis to validate the assumptions and hypotheses and run them through statistical models.

* Modeling: Another key element is the modeling that is used to define how the data will be processed to automatically create accurate predictive models, Rajagopalan says. The algorithms can be as simple as rules that can be applied to understand a particular situation or understand data in the context of a particular scenario. There are also supervised algorithms and models that use machine learning techniques to build hypotheses around trends in the data and constantly refine themselves based on the data they are presented with.

* Deployment: IT leaders then have the outputs of the model, such as a visualization, report, or chart. The results of the predictive analysis are then given to decision-makers.

* Model monitoring: The models are continuously monitored to ensure they are providing the results that are expected.

Before, Rajagopalan says, agencies had specialized units to apply SAS, but those models were expensive to create. The democratization and consumerization of data and of analytics tools have made it easier to create simple and succinct summaries of data that visualize outputs.

What Is Open Data?

Joshua New, formerly a policy analyst at the Center for Data Innovation and now a technology policy executive at IBM, tells FedTech that open data is best thought of as “machine-readable information that is freely available online in a nonproprietary format and has an open license, so anyone can use it for commercial or other use without attribution.”

On May 9, 2013, former President Barack Obama signed an executive order that made open and machine-readable data the new default for government information.

“Making information about government operations more readily available and useful is also core to the promise of a more efficient and transparent government,” the Obama administration noted.

On Jan. 14, 2019, the OPEN Government Data Act, as part of the Foundations for Evidence-Based Policymaking Act, became law. The OPEN Government Data Act makes data.gov a requirement in the statute, rather than a policy. It requires agencies to publish their information online as open data, using standardized, machine-readable data formats, with their metadata included in the data.gov catalog. May 2019 marks the 10th anniversary of data.gov, the federal government’s open data site.

The General Services Administration launched the site with a modest 47 data sets, but the site has grown to over 200,000 data sets from hundreds of data sources including federal agencies, states, counties, and cities. “Data.gov provides easy access to government datasets covering a wide range of topics — everything from weather, demographics, health, education, housing, and agriculture,” according to data.gov.

Predictive Analytics Examples in Government

Federal agencies are using predictive analytics for a wide range of use cases, including cybersecurity. Specifically, agencies are using these tools to predict insider threats, Splunk’s Jardim says. The models look at users’ backgrounds, where they have worked, how often they have logged in to networks at certain times and whether that behavior actually is anomalous. The goal of such tools is to make a good prediction of whether the security events should be tracked by human analysts, Jardim says.

“You only want to surface the events that are very clear insider threats,” he says. “The analyst is focused on high-probability events, not low-probability events.”

Predictive analytics can also be used for agencies’ data center maintenance by applying algorithms to look at compute capacity, how many users are accessing services and to assess throughput for mission-critical applications, Jardim says. Such tools can predict when a particular server will become overloaded and can help agencies preempt those events to ensure users have access to vital applications.
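
As a minimal sketch of that filtering idea, consider flagging only clearly anomalous activity so the analyst sees a handful of high-probability events rather than every log line. The login counts here are hypothetical, and a simple z-score rule stands in for the trained behavioral models such tools actually use:

```python
import statistics

# Hypothetical daily login counts for one user over ten days (illustration only).
logins = [3, 4, 2, 5, 3, 4, 3, 41, 4, 3]

mean = statistics.fmean(logins)
stdev = statistics.stdev(logins)

# Surface only the events that are very clearly anomalous: days whose
# z-score exceeds the threshold get routed to a human analyst.
THRESHOLD = 2.5
anomalies = [(day, count) for day, count in enumerate(logins)
             if abs(count - mean) / stdev > THRESHOLD]
```

Here only day 7, with 41 logins, crosses the threshold. Production systems would fold in richer features (work history, login times, resource access), but the principle of surfacing only high-probability events is the same.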

The Defense Department can also use predictive analytics to ensure that soldiers have enough of the right munitions and supplies in particular theaters of war and enough support logistics. “Logistics and operational maintenance take on a life-or-death consequence if I cannot ship enough munitions or vehicles into a specific theater,” Jardim says.

Qlik’s Churchill says that a customer within the Army is using predictive analytics tools to build models that support force enablement and predict the capabilities that will be needed in the future and which capabilities will be diminished, as well as the capabilities that will be required if certain scenarios arise.

The Pentagon is also working on predictive analytics tools for financial management via the Advana workflow tool, which has brought together roughly 200 of the DOD’s enterprise business systems, Churchill says.

“How can they use predictive models to understand the propensity to have to de-obligate funds from a particular program in the future?” Churchill says. “As I am evaluating the formulation and execution of budgets, technologies like this have the ability to help those decision-makers identify the low-hanging fruit. How do I put those insights in front of people that they wouldn’t have gotten before?”

Predictive maintenance is also a key use case, especially for vehicles and other heavy equipment. Models can ingest data such as the weather and operating conditions of vehicles, not just how many hours they have been running, to determine when they will break down, Churchill says.

Source: Phil Goldstein


Data V Tech is proud to be one of the leading ERP vendors in the Asia Pacific. We have implemented Epicor ERP for many enterprises and organizations, including those in defense, in Asia Pacific. For direct consultation, please feel free to contact us.


Coronavirus: How Artificial Intelligence, Data Science And Technology Is Used To Fight The Pandemic

Since the first report of coronavirus (COVID-19) in Wuhan, China, it has spread to at least 100 other countries. As China initiated its response to the virus, it leaned on its strong technology sector, specifically artificial intelligence (AI), data science, and technology, to track and fight the pandemic, while tech leaders including Alibaba, Baidu, Huawei, and more accelerated their companies’ healthcare initiatives. As a result, tech startups are integrally involved with clinicians, academics, and government entities around the world to activate technology as the virus continues to spread to many other countries. Here are 10 ways artificial intelligence, data science, and technology are being used to manage and fight COVID-19.

1. AI to identify, track and forecast outbreaks

The better we can track the virus, the better we can fight it. By analyzing news reports, social media platforms, and government documents, AI can learn to detect an outbreak. Tracking infectious disease risks by using AI is exactly the service Canadian startup BlueDot provides. In fact, BlueDot’s AI warned of the threat several days before the Centers for Disease Control and Prevention or the World Health Organization issued their public warnings.

2. AI to help diagnose the virus

Artificial intelligence company Infervision launched a coronavirus AI solution that helps front-line healthcare workers detect and monitor the disease efficiently. Imaging departments in healthcare facilities are being taxed with the increased workload created by the virus. This solution improves the CT diagnosis speed. Chinese e-commerce giant Alibaba also built an AI-powered diagnosis system that it claims is 96% accurate at diagnosing the virus in seconds.

3. Process healthcare claims

It’s not only the clinical operations of healthcare systems that are being taxed but also the business and administrative divisions as they deal with the surge of patients. A blockchain platform offered by Ant Financial helps speed up claims processing and reduces the amount of face-to-face interaction between patients and hospital staff.

4. Drones deliver medical supplies

One of the safest and fastest ways to get medical supplies where they need to go during a disease outbreak is with drone delivery. Terra Drone is using its unmanned aerial vehicles to transport medical samples and quarantine material with minimal risk between Xinchang County’s disease control center and the People’s Hospital. Drones also are used to patrol public spaces, track non-compliance to quarantine mandates, and for thermal imaging.

5. Robots sterilize, deliver food and supplies and perform other tasks

Robots aren’t susceptible to the virus, so they are being deployed to complete many tasks such as cleaning and sterilizing and delivering food and medicine to reduce the amount of human-to-human contact. UVD robots from Blue Ocean Robotics use ultraviolet light to autonomously kill bacteria and viruses. In China, Pudu Technology deployed its robots that are typically used in the catering industry to more than 40 hospitals around the country.

6. Develop drugs

Google’s DeepMind division used its latest AI algorithms and its computing power to understand the proteins that might make up the virus, and published the findings to help others develop treatments. BenevolentAI uses AI systems to build drugs that can fight the world’s toughest diseases and is now helping support the efforts to treat coronavirus, the first time the company focused its product on infectious diseases. Within weeks of the outbreak, it used its predictive capabilities to propose existing drugs that might be useful.

7. Advanced fabrics offer protection

Companies such as Israeli startup Sonovia hope to arm healthcare systems and others with face masks made from their anti-pathogen, anti-bacterial fabric that relies on metal-oxide nanoparticles.

8. AI to identify non-compliance or infected individuals

While certainly a controversial use of technology and AI, China’s sophisticated surveillance system used facial recognition technology and temperature detection software from SenseTime to identify people who might have a fever and be more likely to have the virus. Similar technology powers “smart helmets” used by officials in Sichuan province to identify people with fevers. The Chinese government has also developed a monitoring system called Health Code that uses big data to identify and assess the risk of each individual based on their travel history, how much time they have spent in virus hotspots, and potential exposure to people carrying the virus. Citizens are assigned a color code (red, yellow, or green), which they can access via the popular apps WeChat or Alipay to indicate if they should be quarantined or allowed in public.

9. Chatbots to share information

Tencent operates WeChat, and people can access free online health consultation services through it. Chatbots have also been essential communication tools for service providers in the travel and tourism industry to keep travelers updated on the latest travel procedures and disruptions.

10. Supercomputers working on a coronavirus vaccine

The cloud computing resources and supercomputers of several major tech companies, such as Tencent, DiDi, and Huawei, are being used by researchers to fast-track the development of a cure or vaccine for the virus. These systems can run calculations and model solutions much faster than standard computer processing allows.

In a global pandemic such as COVID-19, technology, artificial intelligence, and data science have become critical to helping societies effectively deal with the outbreak.

Source: Bernard Marr


With the assistance of ERP software, many manufacturers in Vietnam and China have remained active during this pandemic. The solution has proved its value during the crisis: it allows employees to work from home while internal and external communication continues seamlessly, which indirectly reduces the spread of the virus.

Data V Tech is proud to be one of the leading ERP vendors in the Asia Pacific region, having implemented Epicor ERP for many businesses across the region. For direct consultation, please feel free to contact us.


3 Questions to Ask Your ERP Vendor About Data Management

Throughout the process of evaluating ERP software solutions, you will have plenty of questions about functionality. You will undoubtedly take guided product tours, looking to match the software’s capabilities with your unique business processes. An ERP solution, however, is a long-term commitment. As you move through the selection process, it will become increasingly important for you to think about long-term benefits. ERP data management should be near the top of your list.

But what are the right questions to ask about data management? How can you elicit direct, meaningful responses from ERP vendors to help inform your decision? The following three questions should be central to your conversation with each vendor.

Who owns my data?

Especially in the cloud ERP world, concerns about data security have grown in recent years. When you trust your data to products in the cloud, you should feel secure that your company is the sole owner of its data. In an ideal scenario, vendors should not have access to your information stored on managed servers. It is the vendor’s job to provide the software and secure the data; that should be the extent of their involvement.

When you ask this question, most vendors are likely to answer that, of course, you are the owner of your data. Most vendors will also direct you to their privacy policy or security documentation. While important, those documents aren’t everything.

You should press the vendor further and ask directly: “Do you use my data for any purposes of your own?” The answer should be a resounding no. Simply put, there is no substitute for hearing your vendor say that to you directly.

If I decide to leave your software, how can I take my data with me?

Notice the wording of the question. Being able to leave a solution with your data should be a given in this scenario. However, the vendor should also have both policy and process in place for exporting your data from their system. Preferably, your information can be converted to a universally exportable file (such as a .csv). It can then be imported into another ERP system.
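As a rough sketch of what that portability means in practice (the records and field names below are invented for illustration, not any vendor’s actual schema), data exported to a .csv can be re-imported anywhere:

```python
import csv

# Hypothetical ERP records, e.g. pulled from a vendor's export tool or a
# database dump. Field names here are illustrative only.
orders = [
    {"order_id": "SO-1001", "customer": "Acme Corp", "total": "2500.00"},
    {"order_id": "SO-1002", "customer": "Globex", "total": "1180.50"},
]

# Write the records to a universally importable .csv file.
with open("orders_export.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=["order_id", "customer", "total"])
    writer.writeheader()
    writer.writerows(orders)

# Any other ERP system (or even a spreadsheet) can re-import the same file.
with open("orders_export.csv", newline="") as f:
    reimported = list(csv.DictReader(f))
```

The point is not this specific format but the guarantee: your data leaves the system in a form any other tool can read.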

The other factor in leaving ERP software is data migration. To what extent will the vendor help you get that information out of their system?

As with the previous question, it is a good idea to ask this directly. Unless there are mitigating circumstances, a good ERP vendor will help you transition away from their product. Ideally, this service should come at no additional cost. If there are costs involved, you should know that upfront.

While many ERP projects are a great success, they don’t always turn out as planned. Experienced vendors know this and will be ready to help when needed.

How is data restricted from (or permitted for) different users?

Many ERP systems grant permissions based on tiers. While that process may be a simple click on the front end, it can become difficult for the company’s operations. Many organizations have scenarios where employees are granted permissions to 95% of the documents related to their job function. It’s the other 5% that makes things difficult.

When employees have to either request access to other documents or send files to another employee, they waste valuable process time. After all, the data flow is one of the problems ERP systems are often created to solve.

Ask your vendor if permissions can be granted on a more granular level. Once customized, every employee who uses the system can access exactly what they need, while being restricted from what they don’t.
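A minimal sketch of what granular permissions might look like under the hood, assuming hypothetical role and document-type names rather than any particular vendor’s model:

```python
# Role-based grants cover the routine 95% of each job function.
# Role and document-type names are invented for illustration.
ROLE_GRANTS = {
    "buyer": {"purchase_orders", "supplier_contracts"},
    "planner": {"purchase_orders", "production_schedules"},
}

# Per-user exceptions cover the remaining 5% without changing the role
# or forcing employees to request files from colleagues.
USER_EXTRA_GRANTS = {
    "alice": {"quality_reports"},
}

def can_access(user: str, role: str, document_type: str) -> bool:
    """Allow access if the role grants it or the user has a specific exception."""
    return (document_type in ROLE_GRANTS.get(role, set())
            or document_type in USER_EXTRA_GRANTS.get(user, set()))
```

With per-user exceptions layered over roles, the awkward 5% stops generating access requests while everything else stays restricted.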

As you discuss ERP solutions with vendors, it’s important to understand the basics (features, support, etc.), but data management should be an essential part of your selection process. By pursuing it in detail, you will feel much more comfortable getting the answers you need from vendors.

Source: ERP Solutions Review

Come to Data V Tech, a hub of talented ERP consultants, and we will help you choose the most suitable ERP solution for your business.

Visit our website for further information about ERP and information and technology in general.


Big Data in ERP: Leveraging for support

Big data in enterprise resource planning (ERP) is defined as larger, more complex data sets, especially from new data sources. These data sets are so huge that traditional data processing software cannot manage them completely. For manufacturers, leveraging big data with ERP systems can help solve certain persistent business problems. Not only does big data offer a huge amount of support for improving visibility and performance, it also enhances your ERP system, from sales forecasting and scheduling to quality control and more.

Scheduling

Big data captures the information needed for all types of scheduling, and is more immediate and readily available, due to today’s expansive collection of delivery devices. Having real-time feedback from your ERP system can give manufacturers a leg up on scheduling efficiency and will ultimately lead to comprehensive ERP scheduling that, in turn, will create better overall project management efficiencies.

Quality Assurance

Predictive capabilities of big data can extend to product quality assurance as well. It allows manufacturers to channel, store and monitor every real-time data point along a production line in order to create better results at the work-in-progress phase rather than having to deal with problems only after a product hits the quality assurance floor.

Supply Chain

Integrating big data analytics into processes and operations leads to greater efficiency in the supply chain. In its report Big Data Analytics in Supply Chain, Accenture found that using big data within ERP systems, rather than on an ad hoc basis, led to 1.3 times the supply chain speed.

Having the ability to keep track of all the moving parts within the supply chain is a huge advantage. Big data improves visibility into each step of the supply chain process. Furthermore, it gives businesses a 360-degree view of where all of their assets are at any given time. Big data improves supply chain reaction time by 41 percent, according to Accenture. When there is a product issue, the mix of big data and ERP systems can help ensure it is handled quickly.

Sales Forecasting

When combined with ERP systems, big data can help businesses predict demand for specific items. Individuals have the ability to track and capture customer trend patterns in real time. They can then immediately apply that focused data to create further direct sales offers. For instance, a retailer could use big data to analyze how the release of a new iPhone model affects the sales of headphones and computer peripherals.
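As a toy illustration of the idea, with invented item names and sales figures (not real data), a simple moving-average forecast over ERP sales history might look like:

```python
# A toy sketch of demand forecasting from ERP sales history.
# The item and monthly unit figures are invented for illustration.
monthly_headphone_sales = [120, 135, 150, 210, 260, 300]  # units sold per month

def moving_average_forecast(history, window=3):
    """Forecast next period's demand as the mean of the last `window` periods."""
    recent = history[-window:]
    return sum(recent) / len(recent)

# Forecast next month's demand from the three most recent months.
forecast = moving_average_forecast(monthly_headphone_sales)
```

Real systems would use far richer models (seasonality, promotions, related-item effects), but the principle of turning captured trend data into a forward-looking number is the same.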

Posted by Elizabeth Quirk in Best Practices

Click here for further information about big data in Epicor ERP – ERP solution for manufacturers or contact us for direct consultation. 

Thank you!


Data Activities for ERP Solution Implementation

Why Your Data Activities Need to Start Long Before You Bring on an ERP Solution Provider

Whether you are migrating to an ERP system for the first time or the nth time, certain data preparation activities are essential. Here are 5 reasons why you should put significant pre-project effort into streamlining your materials and material-plant combinations.

Data quality will improve

When an organization puts a fixed amount of effort into its data preparation while focusing on a smaller data footprint, the relative effort per material-plant combination increases. Thus, the overall data quality will improve.

Ongoing data maintenance will decrease

If you have a smaller data footprint, ongoing data maintenance will remain smaller going forward. Consider, for example, that every manufactured material requires a BOM for every plant where it is made. Of course, if multiple plants routinely manufacture a material, then you should set up and maintain those material-plant combinations, and all of them require BOMs. If, however, you are contemplating setting up a material-plant combination because “one day we just might want to manufacture this part at another plant”, I caution against that.

I have seen far too many “maybe” scenarios never happen. The data maintenance team is saddled with the creation and ongoing maintenance of never-used data elements for the life of the ERP system.  This is a tremendous waste of valuable data maintenance resources.  They should be laser-focused on the critically important data rather than diluting their efforts with the “might never happen” scenarios.

Then you will see the “skeletons in the closet”

To properly streamline materials and material-plant combinations it is necessary to thoroughly investigate and characterize where and how you use materials.  The fact that a given material has inventory means something must be done with it either before or during migration.  If the inventory is healthy and active, then setting up the materials and migrating the inventory to the new system makes sense.  If the inventory is inactive and/or obsolete, however, then the organization must decide what to do with it.

Can the product be reworked and sold, or must it be disposed of?  Equally important is understanding how it came to be.  Was this the result of an over-make by Manufacturing? A poor forecast by Sales?  A canceled sales order?  A quality issue?  Have we found the solution to the underlying cause, or is it continuing?  Is this symptomatic of a larger underlying issue?  Understanding and addressing these sorts of questions will provide great insights into both the organization’s capabilities and mindsets.  This is often a rich source of improvement opportunities and organizations are wise to address these deficiencies early and often.

Product portfolio improvements naturally follow

In characterizing materials, it is helpful to define the specific attributes used to identify individual products.  In doing so, it is common to find that two or more materials have (mostly) the same attributes.  This implies that there are either redundant materials or that some important and distinguishing attributes are missing.  Regardless of which situation it is, just the mere process of chasing down that answer often brings about an enlightened understanding of the product portfolio and leads to questions such as “why do we have so many materials that appear to be so close in fit, form, and function?”  Addressing that question satisfactorily can take months. Thus, you should finish it before the formal ERP project kicks off.
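The duplicate-hunting step described above can be sketched in a few lines, assuming hypothetical material records and attribute names:

```python
from collections import defaultdict

# Hypothetical material master records; attribute names are illustrative.
materials = [
    {"id": "M-100", "form": "bracket", "fit": "10mm", "function": "mounting"},
    {"id": "M-205", "form": "bracket", "fit": "10mm", "function": "mounting"},
    {"id": "M-310", "form": "bracket", "fit": "12mm", "function": "mounting"},
]

# Group materials by their distinguishing attributes. Any group with more
# than one member is either a redundant material or a sign that some
# distinguishing attribute is missing from the data.
groups = defaultdict(list)
for m in materials:
    key = (m["form"], m["fit"], m["function"])
    groups[key].append(m["id"])

suspected_duplicates = [ids for ids in groups.values() if len(ids) > 1]
```

Each flagged group then becomes a question for the business: merge the materials, or add the attribute that genuinely tells them apart.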

It will reduce operational mistakes

Most ERP systems have advanced search functionality that enables users to rapidly find materials, plants, storage locations, and the like. Unfortunately, most ERP systems don’t have a foolproof mechanism to help users discern between material-plant combinations genuinely intended for use and those set up “just in case someday we might want to use it”. Therefore, having both flavors of choice available to users makes mistakes likely. The best way to avoid these mistakes is simply not to set up unintended options in the first place.

Contact us here for the best ERP consultancy

Source: ERP news

data analytics

IT Trends and ERP

Enterprise Resource Planning (ERP) in particular has a lot to gain from adopting an open approach to new innovations. Here are the five technology trends that have shaped the development of ERP:

1. The Internet of Things

The Internet of Things (IoT) is a concept that provides objects, such as cars and electrical appliances, with the capacity to transfer data over a network without requiring human interaction.


Thus, the integration of IoT into ERP can help businesses grow to a new level. For example, you can then easily access information such as location, usage, and performance. As a result, ERP allows organizations to identify issues, e.g., locate unused assets and detect maintenance needs.
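As a minimal sketch of that maintenance-detection idea, with invented asset readings, field names, and thresholds (no particular ERP or IoT platform):

```python
# Hypothetical IoT readings attached to ERP asset records.
# Asset IDs, fields, and thresholds are invented for illustration.
assets = [
    {"id": "PUMP-01", "hours_since_service": 180, "vibration_mm_s": 2.1},
    {"id": "PUMP-02", "hours_since_service": 520, "vibration_mm_s": 7.8},
    {"id": "FAN-07",  "hours_since_service": 40,  "vibration_mm_s": 1.0},
]

SERVICE_INTERVAL_HOURS = 500   # routine service due after this many hours
VIBRATION_ALARM_MM_S = 7.0     # vibration level suggesting a developing fault

def needs_maintenance(asset):
    """Flag an asset that is overdue for service or shows an alarm-level reading."""
    return (asset["hours_since_service"] > SERVICE_INTERVAL_HOURS
            or asset["vibration_mm_s"] > VIBRATION_ALARM_MM_S)

flagged = [a["id"] for a in assets if needs_maintenance(a)]
```

In a real deployment the readings would stream in continuously and the flagged list would feed the ERP’s maintenance work orders, but the rule-based core is the same.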

2. Wearable technology

While much of the attention generated by wearables has focused on consumer propositions like fitness trackers, there are also a host of applications in the workplace. Augmented Reality-enabled glasses like Google Glass will enable hands-free operations, which can be of great benefit for many blue-collar workers.

Smartwatches are more easily accessible and are less prone to ‘accidents’ such as misplacement. Thus, they represent an advance in comparison to PDAs and smartphones. Devices that monitor external factors like UV exposure or heat can help improve the management of employee health.

3. Big data analytics

Organizations have become more dependent on IT. Therefore, they have accumulated a wealth of data that has been traditionally underutilized. As the IoT connects tools and employees to the internet, this data generation will grow exponentially.


By employing analytical tools, organizations can begin to use this data to make accurate predictions that form the basis of a more intelligent approach to business strategy.

4. The age of context

With businesses increasingly operating in a multichannel world, understanding the context is crucial to your performance. ERP software developers are working hard to find the best ways to show you:

  • the situation you’re in,
  • what information you would like to see, and
  • how you would like to see it.

PCs and mobile apps will increasingly integrate context-aware functionality to anticipate user needs and improve the efficiency of day-to-day tasks.

For example, a field service engineer will automatically receive all the asset data, job instructions, and customer relationship history as soon as they arrive at the repair site.

5. Opening business to innovation

Over the next few years, technology like wearables, the IoT and big data analytics stand to reinvent business processes across many different industry sectors. Organizations need to keep an eye on technological advances, even those that may seem to be irrelevant.

  Source: Industry Week