Artificial intelligence and data capture: a perfect pair!

From Siri and Alexa to chatbots and robo-traders, artificial intelligence has fundamentally changed many aspects of how we work, and data capture is no exception.

Close your eyes and imagine dropping your invoices into a scanner, walking away, and letting your computer archive and sort them so that you only have the “exceptions” to process before paying the bills. Do you think this is a dream still far from reality? Not so sure.

Did you know? Truly intelligent capture software does not require templates, keywords, exact definitions, classifications or indexes to do a good job. It can extract the right information and make sense of a multitude of documents on its own, whatever their size, format, language or the symbols they use.

Three ways artificial intelligence is changing data capture

With intelligent capture software, the AI-based “engine” can learn, like a new employee, to perform data entry. It can quickly extract contextual information and learn to interpret the patterns and characteristics of different document types. In addition, it can validate the data and provide additional protection that employees cannot achieve without tedious manual searches.

Intelligent data capture has changed the game for three main tasks: classification, extraction and validation.

Classification

With classification, also called “document sorting”, the software learns to recognize different types of documents (once the user “teaches” it a few variations and examples). The machine learning engine reduces the number of rules to apply, which gives a high level of confidence in document classification with minimal manual effort.
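To make the classification idea concrete, here is a minimal, hypothetical sketch in Python (scikit-learn): a classifier learns document types from a handful of “taught” examples. The data, labels and model choice are illustrative assumptions, not the workings of any particular capture product.

```python
# A minimal sketch of "teaching" a classifier to sort document types
# from a handful of labeled examples (hypothetical data, scikit-learn).
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# A few example documents per class stand in for the variations a user would "teach".
train_texts = [
    "Invoice No. 4711 Total due 1,200.00 EUR payment terms 30 days",
    "Facture numero 2024-001 montant total 980,50 EUR",
    "Purchase order PO-8842 please deliver 40 units by June 5",
    "Order confirmation: your purchase order has been accepted",
]
train_labels = ["invoice", "invoice", "purchase_order", "purchase_order"]

# Character n-grams make the model more tolerant of language, layout and OCR noise.
classifier = make_pipeline(
    TfidfVectorizer(analyzer="char_wb", ngram_range=(2, 4)),
    LogisticRegression(max_iter=1000),
)
classifier.fit(train_texts, train_labels)

# A German invoice was never seen, but shared character patterns should
# push the prediction toward the "invoice" class.
print(classifier.predict(["Rechnung Nr. 99 Gesamtbetrag 500 EUR"]))
```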

Extraction

Artificial intelligence has worked wonders for extracting data from semi-structured and unstructured documents. Consider, for example, identifying the invoice number, which traditionally involves creating complex templates, keywords and links around specific fields and labels. A new employee can look at a document and immediately locate the invoice number, regardless of the form's layout. Now software can do it too, without any programming.
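As an illustration of template-free extraction, the sketch below uses simple context rules (label words near a candidate value) rather than a fixed position on the page. A real intelligent capture engine would learn such cues statistically; the regular expressions and sample strings here are assumptions for demonstration only.

```python
# Minimal sketch of template-free field extraction: locate an invoice number
# by looking at nearby label words rather than a fixed position (hypothetical rules).
import re

# Label words a human (or a learned model) would associate with the field,
# in a couple of languages, followed by a plausible identifier.
LABEL_WORDS = r"(invoice|facture|rechnung)\s*(no\.?|number|num[eé]ro|nr\.?|#)?"
CANDIDATE = r"([A-Z0-9][A-Z0-9\-/]{3,})"

def extract_invoice_number(text):
    """Return the first plausible invoice number found near a label word, or None."""
    match = re.search(LABEL_WORDS + r"\s*[:\s]\s*" + CANDIDATE, text, re.IGNORECASE)
    return match.group(3) if match else None

print(extract_invoice_number("FACTURE Numéro : FA-2024-0042 du 3 mai"))   # FA-2024-0042
print(extract_invoice_number("Invoice # INV/993214 Amount due: 1,200"))   # INV/993214
```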

Validation

AI-driven validation extends database lookups with different tools. It can use several pieces of information (such as quantity, price, description or amount) to link a line item to a record in the system database.
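A minimal sketch of the validation step, assuming a small in-memory catalog: several signals (description similarity and price proximity) are combined to link an extracted line item to a database record, and low scores are routed to a human. The fields, weights and threshold are invented for illustration.

```python
# Minimal sketch of AI-assisted validation: link an extracted line item to a
# database record by combining several signals (description and unit price).
from difflib import SequenceMatcher

catalog = [
    {"sku": "A-100", "description": "Copy paper A4 80g", "unit_price": 4.20},
    {"sku": "B-220", "description": "Toner cartridge black XL", "unit_price": 61.90},
]

def match_line_item(description, unit_price):
    """Pick the catalog entry whose description and price best match the extracted item."""
    def score(entry):
        text_sim = SequenceMatcher(None, description.lower(), entry["description"].lower()).ratio()
        price_sim = 1.0 - min(abs(unit_price - entry["unit_price"]) / entry["unit_price"], 1.0)
        return 0.7 * text_sim + 0.3 * price_sim
    best = max(catalog, key=score)
    return best if score(best) > 0.6 else None   # below threshold: route to a human

print(match_line_item("A4 copy paper 80 g", 4.15))  # matches SKU A-100
```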

Working in tandem: intelligent capture and robotic process automation

The market for RPA (Robotic Process Automation) is booming, and so far it is delivering on its promise to automate complex, rule-based processes. Forrester expects a global market (of which document capture is only a fraction) worth $2.9 billion in 2021, compared with just $250 million in 2016, more than tenfold growth in five years.

In other words, the system itself becomes smarter.

In addition to the obvious advantages of automation, using intelligent data capture software also eliminates guesswork on the configuration side. It is important to note that the goal of AI-based data capture is not to replace humans, but to drive as much automation as possible with machines that can intelligently perform tasks. In the end, employees are freed from mundane tasks and can take on higher-value work that requires a human mind to get things right.

In a world where information and documents are constantly changing, any company that wants to be successful must learn and adapt – ideally with technology that does the same thing.

How Facebook put AI at the heart of its social network

Detection of inappropriate content, newsfeed ranking, facial recognition... The platform makes massive use of machine learning and deep learning.

Artificial intelligence (AI) is present at every level of the social network Facebook. At the heart of the newsfeed, it prioritizes content based on users' interests, their browsing history and their social graph. Similarly, it lets the platform push advertisements that users have a high probability of responding to, from click to purchase. More challenging, AI is also used by the platform to detect unauthorized logins and fake accounts. Finally, it orchestrates other, less visible tasks that remain key to the social network's daily operation: personalizing the ranking of search results, identifying friends in photos (through facial recognition) to recommend tagging them, and handling speech recognition and text translation to automate the captioning of Facebook Live videos...

Target function | Algorithm family | Type of model
Facial recognition, content labeling | Deep learning | Convolutional neural network
Detection of inappropriate content, unauthorized access, classification | Machine learning | Gradient boosting of decision trees
Personalization of newsfeed, search results, advertisements | Deep learning | Multilayer perceptron
Natural language understanding, translation, speech recognition | Deep learning | Recurrent neural network
User matching | Machine learning | Support vector machine
Source: Facebook research publication

The social network makes extensive use of standard machine learning techniques: statistical algorithms (classification, regression, etc.) suited to building predictive models from numerical data, for example to predict changes in activity. Facebook uses them to detect inappropriate messages, comments and photos. “Artificial intelligence is an essential component for protecting users and filtering information on Facebook,” insists Yann LeCun, vice president and scientific director for AI at the group. “A series of techniques are used to detect hateful, violent, pornographic or propaganda content in images or texts, and, conversely, to label the elements likely to...
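For readers wanting to see what the “gradient boosting of decision trees” line in the table corresponds to in code, here is a minimal, generic scikit-learn sketch on made-up numerical features. Facebook's actual models, features and training data are not public; this only illustrates the technique.

```python
# Minimal, generic sketch of gradient-boosted decision trees for a moderation-style
# classification task. Features and labels are invented; this is not Facebook's model.
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
# Pretend features: e.g. report count, account age, link ratio in the post.
X = rng.random((1000, 3))
# Invented labeling rule so the example runs end to end.
y = ((X[:, 0] > 0.7) & (X[:, 1] < 0.3)).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = GradientBoostingClassifier(n_estimators=100, max_depth=3)
model.fit(X_train, y_train)
print("held-out accuracy:", model.score(X_test, y_test))
```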

Multiple neural networks

Alongside traditional machine learning, deep learning is of course also used by Facebook. Based on the concept of the artificial neural network, the technique applies to numbers as well as to audio or image content. The network is divided into layers, each responsible for interpreting the results of the previous one; the AI thus refines its interpretation through successive iterations. In text analysis, for example, the first layer handles the detection of letters, the second words, the third noun or verb phrases, and so on.
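The layered idea can be sketched in a few lines of PyTorch: each layer transforms the representation produced by the previous one. The dimensions and data below are arbitrary placeholders.

```python
# Minimal PyTorch sketch of a layered network: each layer reinterprets
# the output of the previous one. Sizes and data are arbitrary.
import torch
import torch.nn as nn

model = nn.Sequential(
    nn.Linear(256, 128), nn.ReLU(),   # first layer: low-level features
    nn.Linear(128, 64), nn.ReLU(),    # second layer: combinations of those features
    nn.Linear(64, 10),                # final layer: scores for 10 classes
)

x = torch.randn(32, 256)              # a batch of 32 input vectors
scores = model(x)
print(scores.shape)                   # torch.Size([32, 10])
```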

Unsurprisingly, Facebook applies deep learning to facial recognition, in particular via convolutional neural networks, an algorithm that is especially effective for image processing. Via the multilayer perceptron, well suited to ranking tasks, the social network also uses deep learning to personalize the newsfeed and the results of its search engine. Finally, deep learning is also leveraged by Facebook for machine translation.

To recognize text embedded in images, Facebook uses a pipeline of three neural networks. © Capture / JDN

For image processing in particular, Facebook has built a deep learning platform called Rosetta. It is based on a pipeline of three neural networks (see screenshot above). The first, a convolutional network, maps the photos. The second detects the area or areas containing characters. The third tries to recognize the words, expressions or sentences present within the identified regions. The technique, known as Faster R-CNN, has been slightly adapted by Facebook to its needs. The objective: to better qualify posted images in order to optimize their indexing, whether in the newsfeed or in the search engine.
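Rosetta itself is internal to Facebook, but the detection stage it describes can be approximated with torchvision's off-the-shelf Faster R-CNN. Note that the pretrained weights detect generic COCO objects rather than text regions, so this is only a sketch of the technique, not of Rosetta.

```python
# Illustrative only: torchvision's generic Faster R-CNN stands in for Rosetta's
# text-region detector. The pretrained weights detect COCO objects, not characters.
import torch
import torchvision

model = torchvision.models.detection.fasterrcnn_resnet50_fpn(weights="DEFAULT")
model.eval()

image = torch.rand(3, 480, 640)            # a dummy RGB image tensor in [0, 1]
with torch.no_grad():
    predictions = model([image])[0]        # boxes, labels, scores for one image

# Keep the most confident regions; a second network would then "read" each crop.
keep = predictions["scores"] > 0.8
print(predictions["boxes"][keep])
```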

A deployment pipeline

To orchestrate its various AIs, the American group has a homemade pipeline. Its code name: FBLearner (for Facebook Learner). It starts with an app store: for the internal teams of data scientists, it federates a catalog of reusable features for both the training phases and the deployment of algorithms. A workflow brick is then added to manage the training of the models and evaluate their results. Training jobs can run on compute clusters (CPUs) as well as on clusters of graphics accelerators (GPUs), again designed in-house. The last stone in the building: an environment that runs the predictive models, once trained, directly within Facebook's applications.

Workflow built by Facebook to develop and deploy its machine learning and deep learning models. © Capture / JDN

On the deep learning library side, Facebook has historically chosen to develop two, each now available as open source. Designed for its basic research needs, the first, PyTorch, is characterized by its great flexibility, its advanced debugging capabilities and, above all, its dynamic neural network architecture. “Its structure is not determined and fixed; it evolves as learning progresses and training examples are presented,” says Yann LeCun. The downside: the Python execution engine under the hood makes PyTorch inefficient for production applications. Conversely, the second deep learning framework designed by Facebook, Caffe2, was built precisely for production deployments. In this context...
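Yann LeCun's remark about a structure that “evolves as training examples are presented” refers to PyTorch's define-by-run graphs. A minimal sketch, with arbitrary sizes, in which the network's depth depends on the input of each forward pass:

```python
# Minimal sketch of PyTorch's define-by-run behaviour: the graph is rebuilt on every
# forward pass, so ordinary Python control flow can change its structure per example.
import torch
import torch.nn as nn

class DynamicDepthNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.layer = nn.Linear(16, 16)
        self.head = nn.Linear(16, 2)

    def forward(self, x):
        # Apply the shared layer a data-dependent number of times.
        steps = 1 + int(x.abs().mean() * 3)
        for _ in range(steps):
            x = torch.relu(self.layer(x))
        return self.head(x)

net = DynamicDepthNet()
out = net(torch.randn(4, 16))
out.sum().backward()          # autograd follows whatever graph was actually built
print(out.shape)              # torch.Size([4, 2])
```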

More recently, Facebook has built a tool called ONNX to semi-automate the conversion to Caffe2 of models originally created in PyTorch. “The next step will be to merge PyTorch and Caffe2 into a single framework called PyTorch 1.0,” says Yann LeCun. “The goal is to benefit from the best of both worlds: a flexible infrastructure for research and efficient compilation techniques to produce usable AI applications.”
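A minimal sketch of the PyTorch-to-ONNX step described here. For simplicity the exported model is run with onnxruntime, a generic ONNX runtime, rather than with Caffe2; the model and tensor shapes are arbitrary.

```python
# Minimal sketch: export a PyTorch model to ONNX, then run it with a generic
# ONNX runtime (the article's target, Caffe2, consumed the same format).
import torch
import torch.nn as nn
import onnxruntime as ort

model = nn.Sequential(nn.Linear(8, 4), nn.ReLU(), nn.Linear(4, 2)).eval()
dummy_input = torch.randn(1, 8)

torch.onnx.export(model, dummy_input, "model.onnx",
                  input_names=["input"], output_names=["scores"])

session = ort.InferenceSession("model.onnx")
outputs = session.run(None, {"input": dummy_input.numpy()})
print(outputs[0].shape)       # (1, 2)
```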

Towards the design of a processor tailored for AI

Following the model of Google's Tensor Processing Units (TPUs), tailored for its TensorFlow deep learning framework, Facebook is also planning to develop its own chips optimized for deep learning. “Our users upload 2 billion photos a day to Facebook, and within 2 seconds each one is handled by four AI systems: one manages the filtering of inappropriate content, the second the labeling of images for their integration into the newsfeed and the search engine, the third facial recognition, and the last generates a description of the images for the blind. All of this consumes gigantic computing and electrical resources. With the real-time video translation we are now looking to develop, the problem will only intensify.”

OVH shapes its cloud infrastructure for AI

Optimized for machine learning and deep learning, the IaaS is meant to come with a whole range of complementary services to manage training and deployment pipelines.

OVH's artificial intelligence strategy is taking shape. The first stage of the rocket: the French cloud provider intends to build a computing infrastructure tailored to machine learning, an IaaS optimized for network performance, CPU compute and GPU acceleration. The second stage: designing virtual or bare-metal servers that address the most common AI use cases. Lastly, OVH plans to offer a series of managed cloud services to facilitate the deployment of machine learning pipelines on its infrastructure. A strategy that has the merit of being clear and precise.

As early as 2017, OVH delivered its first GPU instances on its public cloud (OpenStack), with machine learning among the main targeted use cases. At its latest customer event in October 2018, the Roubaix-based group completed the lineup with Nvidia Tesla V100 GPU virtual machines designed to accelerate the training phases of neural networks. “In the coming days, we will also market NVMe flash storage offerings targeting intensive applications,” said Alain Fiocco, chief technical officer of OVH. For companies preferring a dedicated training environment...

To top it all off, OVH has just announced support for Nvidia GPU Cloud (NGC) technology on its Nvidia Tesla V100 GPU instances. It gives its customers access to a catalog of machine learning libraries (Caffe2, MXNet, PyTorch, TensorFlow...), all optimized for the American chipmaker's graphics processors. Available as containers, these pre-integrated frameworks embed everything needed for their execution, from the Nvidia CUDA environment to the OS, by way of the Nvidia libraries.

Best of all, NGC software is also compatible with OVH's dedicated DGX-1 server offer, currently in beta. Equipped with 8 graphics processors, this Nvidia multi-GPU machine targets the intensive training needs of deep learning. “This offer lets us test the market's appetite for this type of configuration. If there is enough demand, we could consider building our own multi-GPU machine,” says Alain Fiocco.

Asked whether OVH could go as far as designing its own graphics processors for deep learning, as Google did with its TPU, OVH's technical director answers in the negative. “Our mission is not to manufacture chips, but rather to assemble servers from off-the-shelf components to achieve a price/performance/density ratio that makes the difference.” A path Facebook already takes for its internal needs with its home-built eight-GPU physical machines. As for the rest of its infrastructure, OVH already bases its AI virtual machines and bare-metal offerings on servers designed by its R&D teams in Roubaix and assembled in its factory in Croix, a few kilometers away.

Alain Fiocco is CTO of OVH. © OVH

In parallel, OVH intends to capitalize on its internal AI developments to offer its customers new products. An example of this approach: the machine learning platform offered in its Labs (in alpha version) comes from an internal project centered on predictive analysis of the life cycle of its IT infrastructure. “We decided to extend it to make it generalizable and respond to use cases from other business units; since then, we have also been using this application for fraud detection,” explains Alain Fiocco.

From there, packaging it and marketing it as a cloud service is only a small step. “In the same vein, we could in the future let our customers benefit from our predictive models for IT capacity management,” the CTO adds.

A Spark service tested in the Labs

Another illustration of this logic of turning internal building blocks into products: FPGA (Field-Programmable Gate Array) processors. Historically, OVH has used these reprogrammable chips as part of its system for fighting denial-of-service attacks (read OVH's post on the subject), which relies on FPGA servers assembled, again, by the group's teams. “We could definitely consider marketing them if the need arises among our customers,” says Alain Fiocco. In its Labs, OVH also offers (in beta) a PostgreSQL database acceleration service that already takes advantage of these FPGA machines.

In total, OVH has a team of about twenty people dedicated to its R&D projects in data science and artificial intelligence (excluding business intelligence). Alongside the initiatives mentioned above, it is working on other experimental AI projects available on OVH Labs. This is the case, for example, with an image recognition engine and an Apache Spark compute cluster cloud service. Built directly on the company's OpenStack public cloud infrastructure, the latter lets customers train machine learning models using the SparkML library. On the pricing side, these managed cloud solutions will initially be made available free of charge; only the underlying machine resources (virtual or bare metal) actually consumed by the customer will be billed.
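As an illustration of what training with SparkML on such a managed cluster might look like, here is a minimal PySpark sketch. The input path, column names and choice of algorithm are placeholders, not OVH specifics.

```python
# Minimal SparkML sketch: train a model on a Spark cluster. The input path,
# column names and algorithm choice are placeholders.
from pyspark.sql import SparkSession
from pyspark.ml.feature import VectorAssembler
from pyspark.ml.classification import LogisticRegression

spark = SparkSession.builder.appName("sparkml-demo").getOrCreate()

# Load a (hypothetical) labeled dataset and assemble numeric columns into a feature vector.
df = spark.read.csv("data/training.csv", header=True, inferSchema=True)
assembler = VectorAssembler(inputCols=["feature_1", "feature_2"], outputCol="features")
train = assembler.transform(df)

model = LogisticRegression(featuresCol="features", labelCol="label").fit(train)
model.write().overwrite().save("models/logreg")   # persist for later serving

spark.stop()
```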

Among its first AI references, Octave Klaba's company highlights Systran. The machine translation specialist uses OVH's Nvidia DGX-1 servers to run its intensive neural network computations for language processing.

Microsoft wants to train people on Linux

In partnership with the Linux Foundation, the publisher is launching a certification training focused on Linux. It aims to support IT professionals who want to use the OS on its cloud.

Microsoft's new open source initiative: in partnership with the Linux Foundation, the publisher is launching a certification focused on Linux. It teaches how to run the open source OS on the US group's cloud.

This initiative is not necessarily a surprise. Microsoft now focuses much of Azure's technology policy on open source. In total, nearly fifty open source applications and components are already available on the cloud.

A strategy that also aims to respond to demand: Microsoft recently revealed that a quarter of the customer instances running on Azure run Linux.

A certification consisting of two modules

No wonder Microsoft also wants to support this movement with a new training offer. The proposed certification involves passing two modules. The first leads to the Microsoft Specialist certification “Implementing Azure Infrastructure Solutions” (exam 70-533). The second corresponds to the Linux Foundation's LFCS certification exam on Linux system administration. These modules are offered by approved training centers.

“Microsoft loves Linux.” This is the message Satya Nadella, Microsoft's CEO, now intends to convey. A positioning unveiled for the first time in October 2014. Credit: Microsoft

In total, Microsoft supports seven Linux distributions on Azure: Ubuntu, CentOS, SUSE Linux Enterprise, openSUSE, Oracle Linux, as well as Debian and Red Hat, whose support in the publisher's cloud was introduced more recently (read the article: Red Hat becomes Microsoft's weapon to sell Azure to Linux users).

End of unlimited OneDrive storage for consumers: Microsoft responds to users' grumbling

Faced with the outcry that followed the announcement of the end of unlimited storage in OneDrive for Office 365 Family, Personal and University subscribers, Microsoft is making arrangements.

In October 2014, Microsoft launched an unlimited file storage offering on OneDrive for Office 365 Family, Personal and University subscribers. Citing abuse, it announced a little over a year later that it would reintroduce a limit from January 2016, set at 1 TB. The publisher also ended its 100 GB and 200 GB OneDrive plans, intending to replace them with an offer capped at 50 GB for $1.99 per month. As for its free entry-level OneDrive service, it went from 15 GB to 5 GB. Microsoft gave Office 365 subscribers who had benefited from its unlimited offer 12 months to clean up their storage (within a limit of 10 TB used) and return to 1 TB.

While a petition calling for the return of the unlimited offer has currently collected more than 72,000 signatures, Microsoft has been forced to rework its proposal. The group commits first of all to fully refund customers who are not satisfied with the terms under which its unlimited Office 365 storage offer is being withdrawn. Users of free OneDrive accounts can keep their 15 GB of storage capacity (plus the extra 15 GB bonus where applicable) by visiting this activation page (valid until January 31, 2016). “As for users of free OneDrive accounts with more than 5 GB of stored content, they are offered a free one-year subscription to Office 365 Personal with 1 TB of storage included (this offer will be sent by email in early 2016),” Microsoft notes.

Finally, it is worth noting that the Office 365 roadmap still lists (as we write these lines) “an upcoming introduction of unlimited storage in OneDrive for Business”.

How unified communications meets the need for business continuity

An outage of communication systems leads to delays in customer service and in responding to customer needs, resulting in decreased customer satisfaction.

Malfunctioning essential services, such as communications or software solutions, translate very quickly into lost revenue and often a tainted reputation. An outage of communication systems leads to delays in customer service and in responding to customer needs, which inevitably decreases customer satisfaction.

Many CIOs now have to focus on business continuity and communication. Yet a recent survey of 1,000 IT managers published by Computer Weekly revealed that 96% of companies are not confident in their ability to communicate when faced with a disaster. In addition, 30% indicated that they did not have a communications continuity plan.

Yet simple solutions are available: with a cloud-based unified communications (UC) system, an organization can dramatically improve business continuity by leveraging off-site solutions that have been rigorously tested to provide 99.999% uptime.

Definition of business continuity

The Business Continuity Institute (BCI) defines business continuity as “the ability of the business to continue delivering products or services to acceptable pre-defined levels after a disruptive incident.”

In practice, the term describes the creation of plans to maintain critical business functions despite disasters, emergency incidents and other external factors that would otherwise disrupt the business's ability to function.

Frequent obstacles in the pursuit of business continuity

Providing ongoing communications services that would otherwise be disrupted requires a comprehensive, in-depth plan that looks at all facets of the organization and identifies ways to keep critical functions running. It is an ongoing process of analysis, design, implementation and validation.

Organizations often face multiple obstacles in pursuit of seamless business continuity:

  • Poor understanding of the organization.

Failing to properly understand an organization's continuity requirements in the event of a disaster can result in a business continuity plan that falls flat. This problem must be solved by working directly with all levels of management when creating the program, and by meeting their specific needs.

  • Failing to incorporate business continuity into the corporate culture.

Corporate culture dictates where employees place their value and attention, and business continuity tools must be integrated into this culture to be truly effective. Encouraging all employees to communicate using a personal cloud-based unified communications solution can build the reflex of using a tool designed with business continuity in mind.

  • Following a template rather than a guide.

There are several methods and several effective business continuity programs, and many companies invest in them to save time. Some are worth the investment, while others result in a plan that fails when it is needed. The solution? Look for a guide to help you develop a plan: find a consultant or program that favors analysis over box-ticking.

Unified communications as an emergency response tool

Cloud-based telephony solutions, the prevalence of Bring Your Own Device (BYOD) programs and advances in cloud hardware have all contributed to the emergence of personal unified communications as an ideal emergency response system.

Personal unified communications provide an effective experience for all users across all platforms and operating systems, along with better communication and collaboration during normal operations. When emergencies occur and traditional on-site solutions fail, unified communication is maintained: since the services are hosted off-site, they are not affected by the disaster.

When the system is in recovery mode, employees can use any device to communicate with each other. Even if mobile towers go down across the region, any device with an Internet connection will be able to reach the entire organization. Cloud-based telephony solutions help businesses keep running when faced with snowstorms, fires or technical failures.

Societe Generale: investment bank moves to cloud 2.0

After a proof of concept, Societe Generale's Global Banking and Investor Solutions division has decided to embark on the creation of a Docker-oriented private cloud.

Société Générale was present at DockerCon Europe in Barcelona on November 16 and 17. A sponsor of this second edition of the Docker customer event on the Old Continent, the bank was the only player from outside the IT sector to hold a stand, alongside many partners of the US publisher. A way for the French group to highlight its interest in Docker's cloud innovations. Is this surprising? Not that much, given the bank's digital history and its policy of openness to the digital ecosystem over the past two years or so. Organizing hackathons, actively contributing to technology meetups, taking part in several major digital events (the Best Dev of France competition, Devoxx, Cloud Computing World Expo)... Societe Generale's participation in DockerCon Europe is the latest initiative on an already long list.

A Docker proof of concept that proves conclusive

But behind Société Générale's presence at DockerCon Europe also lies, of course, the group's interest in Docker. A few months ago, a proof-of-concept (PoC) project on this technology was initiated within its Global Banking and Investor Solutions (GBIS) division. Alongside the retail bank, it is one of the two main divisions of Société Générale, covering corporate and investment banking, asset management, private banking and investor services. “Our PoC on Docker was conclusive,” says Adrien Blind, DevOps coach and “disruptive agitator” at GBIS, whom we met at DockerCon.

Societe Generale's stand in the exhibition area of DockerCon Europe 2015, held on November 16 and 17 in Barcelona. © JDN

During this PoC phase, the bank defined three use cases for Docker. First, the technology was identified as a lever for optimizing continuous deployment: because of its lightness, a container would allow GBIS to push releases to production faster than with conventional VMs. The second use case in GBIS's sights: handling load spikes. With its ability to provision images almost on the fly, the container again offers a level of efficiency well above that of a VM. “In a business environment like the financial markets, some applications have a fluctuating workload, so the size of their computing resources must be able to adapt dynamically throughout the day...
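The load-scaling use case can be sketched with Docker's Python SDK (docker-py): start or stop containers of an application image as demand fluctuates. The image name, label and target replica count below are hypothetical placeholders, not GBIS's actual setup.

```python
# Minimal sketch of load-driven scaling with the Docker SDK (docker-py).
# The image name, label and target count are hypothetical.
import docker

client = docker.from_env()
IMAGE = "gbis/pricing-service:latest"      # hypothetical application image
LABELS = {"app": "pricing-service"}

def scale_to(target):
    """Start or stop containers so that `target` replicas of the image are running."""
    running = client.containers.list(filters={"label": "app=pricing-service"})
    if len(running) < target:
        for _ in range(target - len(running)):
            client.containers.run(IMAGE, detach=True, labels=LABELS)
    else:
        for container in running[target:]:
            container.stop()

scale_to(4)   # e.g. scale up ahead of a market open, scale back down afterwards
```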

Towards a first Docker platform in production

As a result, the next challenge for GBIS is to build a Docker platform to host applications in production internally. To build such a private cloud, GBIS plans to rely on one of the flagship technologies positioned in this field; Cloud Foundry and OpenShift (Red Hat) are examples. But GBIS will also have to deal with its existing estate: this Société Générale division has already deployed a vast private cloud based on the VMware offering.

With Docker, the production and development teams will be able to speak the same language

At the same time, GBIS's production and development teams will also have to tame Docker to make the most of its benefits: a real change-management challenge. “This is precisely the role of our IT coaches, and especially our DevOps coaches,” continues Adrien Blind. “Before becoming coaches, they often have experience close to that of the professionals they now support, which gives them legitimacy and allows them to speak the same language, using examples that speak to the teams.”

Finally, containers could well move GBIS's continuous deployment process into a new era. Docker technology is a way to build a bridge between GBIS's IT infrastructure management and its application development activities, missions currently assigned to two very different IT departments. “With Docker, a new point of convergence is emerging,” says Adrien Blind. The IT department in charge of infrastructure could, for example, ultimately provide base application images no longer as “code artefacts” but as containers, on which the development teams could then build the business applications.


Marketing: Salesforce snaps up MinHash's data scientists

The CRM heavyweight has just acquired a very young startup that aims to improve the relevance of marketing campaigns through machine learning.

Salesforce has confirmed the acquisition of MinHash, a young Californian startup that wants to put machine learning at the service of marketing. Its platform and its virtual assistant, Aila, promise to detect emerging trends on the internet and, accordingly, to propose more targeted and relevant campaigns. The startup was founded in March 2014 by data scientists who previously worked for eBay and Avaya.

The amount of the acquisition has not been disclosed. On its website, the MinHash team explains that it is joining Salesforce in order to continue its machine learning work “on a much larger scale, at the world's leading CRM”. Some observers already imagine that Salesforce will use this acquisition to build a new big data or marketing analytics offering, something that has been pending for some time (read our article What Salesforce Will Do With Its Analytics Solutions). Salesforce had already acquired a virtual assistant (RelateIQ, for nearly $400 million), which it has since integrated into its offering for small and medium-sized businesses (read SalesforceIQ: adding intelligence to the customer relationship).

In any case, the MinHash platform will be shut down on January 21, according to the company's website.

Cloud: VMware Terminates EMC Joint Venture Project

The virtualization specialist has announced its withdrawal from the joint venture project unveiled with EMC. Through this joint venture, the two players wanted to give birth to a hybrid cloud player.

In a document filed with the US securities regulator (SEC), VMware announces that it is pulling out of its joint venture project with EMC. The virtualization specialist and its parent company formalized this joint initiative in October. Its goal: to create a hybrid cloud infrastructure player capable of offering a combined public cloud (IaaS) and private cloud offer. Incorporating part of VMware's business, it was to be carried by Virtustream, a private cloud company acquired by EMC in May 2015. The joint venture was to take that brand as its name.

How to explain this about-turn? Since the announcement of EMC's acquisition by Dell, VMware's stock has been declining, from $82 to $57 (its closing price on December 14). Although VMware is 80% owned by EMC, the company has separate management and its own stock listing. Over the last two months of VMware's stock retreat, the sharpest drop took place on October 21, the day the joint venture with EMC was announced. Beyond the poor signal it sends, the decline is even more troublesome than usual for VMware's management. As part of the EMC acquisition, Dell plans to pay EMC shareholders $24.05 per share, plus $9.10 per share in the form of VMware tracking stock. Indexed to the financial performance of the division they target (that is the logic of tracking stocks), the potential value of these “mirror shares” has consequently fallen along with VMware's share price. This is undoubtedly what motivated the vendor's decision to backtrack, in a bid to reassure its shareholders.

60% of CAC40 companies have adopted Office 365

Société Générale, Alstom, Legrand, Vinci... The jewels of the French economy are massively opting for Microsoft's collaborative suite. An overview of these deployments.

The Microsoft steamroller is on the move. In the war against Google over the adoption of cloud-based collaborative solutions in business, the Redmond group has stepped on the gas. In recent years Google managed to win some fine references among the jewels of the French economy: Air Liquide, Lafarge, Essilor and Valeo use its Google Apps for Work. But since then Microsoft has worked hard, boosted by the “mobile first, cloud first” message of its CEO, Satya Nadella. As a result, 60% of CAC40 companies now use all or part of the Office 365 suite. “Two years ago, we saw an acceleration of projects,” observes Nathalie Wright, director of the large companies and alliances division at Microsoft France.

630,000 Office 365 users identified in the CAC40

For this report, the Journal du Net identified ten CAC40 groups implementing Office 365, totaling more than 630,000 users of the Microsoft offer (see the table below). The most iconic projects? Announced a year ago, Société Générale's DigitForAll program aims, for example, to install SharePoint, Skype for Business and Yammer on the workstations of 150,000 employees. At Accor and Vinci, migration projects exceed 100,000 users. (And our panorama does not include the 60,000 seats recently announced by Suez Environnement, which has since left the CAC40.)

Faced with Google, Microsoft plays to its strengths: its historical presence in business, its dense network of partners and, of course, data security, this last point being the main obstacle to the adoption of SaaS services. “For a large account, respect for data localization and data privacy are essential elements,” recalls Nathalie Wright. Microsoft recommends hosting sensitive data within the company's own infrastructure or at a hosting provider, something Office 365 makes possible since it offers the choice between a SaaS version and an on-premise version. An alternative that remains impossible with Google Apps.

First brick: messaging

The various Office 365 modules are deployed progressively by the CAC40 players involved. The first brick is usually email, driven by the desire to switch to a new-generation mailbox. “In 2013 and 2014, we saw a significant wave of migrations from IBM Lotus Notes to Microsoft Office 365,” says Cyril Drevon, head of infrastructure activity at Exakis Lyon, a Microsoft Gold Partner.

Legrand switched 20,000 Lotus Notes mailboxes to Exchange Online and Lync (now Skype for Business) in less than six months. In fact, email and instant messaging usually go hand in hand. For unified communications, change support is light: users often already use this type of tool in their private lives, with Gmail or Outlook. “Deploying Skype at the same time as email creates a ‘wow’ effect, which generates positive momentum,” confirms Maximilien Chayriguès, head of the firm EMS Conseil, which notably supported Alstom and LVMH.

Company | Number of users | Nature of deployment
Accor | 160,000 | Project initiated in 2013. Deployment of Office 365 (including SharePoint Online) for employees and franchisees around the world.
Alstom | 85,000 | Chose Microsoft cloud solutions as early as 2010. Deployment of Office 365 including Skype for Business, supported by Orange Business Services.
Axa | NC | Deployment of Office 365 with Axa Global Direct.
BNP Paribas | NC | Deployment of Microsoft solutions including Skype for Business on Windows 8 tablets.
Capgemini | 8,000 | Yammer retained as the corporate social network. Deployment of Office 365 (Exchange and SharePoint) for employees of the Sogeti subsidiary as of 2013.
Engie | |
Legrand | 20,000 | Migration of Lotus Notes mailboxes to Exchange Online and Skype for Business. SharePoint and Yammer being deployed.
L'Oreal | 25,000 | Yammer retained as the corporate social network. Office 365 deployed in a pilot phase with a few hundred users in September 2014.
Societe Generale | 150,000 | Deployment of the full Office 365 suite on workstations announced in September 2014. Delivery of 90,000 Windows tablets being finalized.
Vinci | 183,000 | First large-scale deployment announced in September 2012. The Vinci Energies subsidiary alone has 42,000 active accounts (including Skype for Business, SharePoint and Yammer).

Next step: collaborative tools

After email, the next step is collaborative tools. Change management is more complex this time: new uses and good practices have to be introduced, sometimes with nothing already in place. With Yammer, OneDrive and SharePoint, Office 365 offers a variety of tools for publishing and sharing content, and the challenge is to highlight the most relevant one for a given use. While some organizations are trying to eradicate internal email, the risk is otherwise great of simply stacking up tools (email, chat, enterprise social network...). Hence the importance of change support.

“A CAC40 company has the advantage of a workforce that is already asking for mobility and innovation solutions,” says Cyril Drevon approvingly. “In these multinationals, employees are used to working in virtual teams around the world, especially in research or marketing departments; they just need to be guided toward good practices.” Nathalie Wright also sees these tools as an asset for large accounts to attract and retain talent: young generation-Y workers would not understand working any other way.

The biggest challenges are organizational

Another difficulty for CAC40 groups to overcome: the constant evolution of these collaborative solutions, with new features arriving roughly every 30 days. Maximilien Chayriguès counted 440 changes to Office 365 in a single year! A pace inherent to SaaS, but disconcerting for companies used to running a major migration project over two or three years and then moving on to the next one without looking back. “Large groups must be able to absorb these new features and route them to the right department; a new service can be interesting for marketing, another for IT,” says Cyril Drevon. This requires internal resources such as transition managers or community managers, profiles that have begun to appear among CAC40 players.

Office 365 vs Google Apps for Work

While Microsoft asserts its legitimacy through its knowledge of the business world, it is amusing to see that the publisher recently aligned the names of Office 365 modules with those of its consumer products to ease adoption: Lync has become Skype for Business, and Exchange, Outlook.com. The exact opposite of Google which, coming from the consumer world with a pure web approach, is trying to win over companies by professionalizing its offer. “The choice is there,” says Cyril Drevon. “The big groups that want continuity of usage choose Office 365; those positioned on ‘disruptive full web’ turn to Google Apps,” he notes. In his eyes, the difference therefore does not lie in cost, which is very similar.