Categories
AI business IT

Prediction: in 2023 we will finally see the beginning of a wider business adoption of machine learning and AI Services – and here’s why

At the end of 2022, ChatGPT made its way into the news and created a lot of buzz.

The reason: OpenAI, the company behind ChatGPT, had built a new frontend for its generative language model GPT-3. GPT-3 had been released back in 2020 and was already the largest model ever created at the time, but it was only accessible through an API, for which one had to make it through a waiting list. ChatGPT changed the game by being an easy-to-use, free-for-everybody „chat“ interface for interacting with GPT-3. For the first time, many users understood what machine learning, or AI services, are capable of: they created poems, let ChatGPT write yet another Star Wars movie script, and did many other fun things. But the underlying achievements OpenAI came up with are nothing less than stunning – and they will teach many businesses what benefits AI services can bring.

ChatGPT has made visible the potential that AI services have when they are skillfully combined, and when models that have technically been around for years are trained on amounts of data that were previously unthinkable. GPT-3 was trained on roughly 10x as much data as previous models. More specifically, GPT-3 combines multiple models and techniques, such as semi-supervised learning and transformers, in an intelligent way – and that is the fascinating part.

Generally, until now, there were a number of „capabilities“ that an AI model brought to the table, e.g. the classics like sentiment analysis („What is the sentiment in a certain text?“) or classification („Is the text a question or a statement?“).
This is now different: GPT-3 can not only do the above, but can also learn new tasks very quickly, with high efficiency and accuracy. These are called the zero-, one- or few-shot capabilities of a model, and GPT-3 achieves remarkably good results here. This means, for example, that you can teach it to translate into a new language with just three example „shots“, and from then on the model does it by itself.

Why this is so important for companies: the ability to (autonomously) learn and adapt.

Every company claims to be unique. That may be true in some areas, but the cross-functional areas (IT, HR, finance, etc.) are often essentially the same. The HR department of a bank does not work much differently than the HR department of an automotive supplier. This also explains the success of „general“ office products like Excel and co. that are used in every company (a spreadsheet like Excel, by the way, compares structurally quite well with an AI model). But WHAT gets calculated in Excel changes from company to company.
Modern AI architectures like GPT-3 are now able to learn exactly this by themselves:
1. what is my company specific data to work on?
2. what are my company-specific questions that I should answer?
3. what are my company-specific added values that I should deliver?

These capabilities, which ChatGPT now presents to users in a very concrete way, are what will now drive the entry of AI into companies. Because the above results are simply „shocking“ in a positive sense.
I see three areas in particular where we will see AI services much more often very soon:
1. integrated AI: e.g. AI built directly into a software product to make predictions (for example, a Salesforce AI service that directly qualifies a lead)
2. standalone AI services: e.g. a chatbot that answers customer service questions on its own
3. generative AI services: corporate communications, marketing copy, sales presentations that a service creates autonomously and that a „real“ employee only approves or fine-tunes afterwards.

The productivity gains are enormous, and the knowledge about introducing AI services, and about which skills and teams are needed, will also spread. Because one thing should be clear to everyone: AI services are far more than a technical tool to be rolled out; they are a far bigger corporate change than all „digitization measures“ combined. Digitization, by comparison, was a wet fart.

Questions? Comments? Want to chat? Contact me on Mastodon, Twitter or send a mail to ingmar@motionet.de

Categories
AI business IT

How TensorFlow can help HR departments streamline their processes

TensorFlow is a powerful open-source machine learning platform developed by Google that can help HR departments in a variety of ways. At its core, TensorFlow allows users to build and train complex models using large amounts of data. This ability to process large amounts of data quickly and accurately makes TensorFlow an ideal tool for HR departments looking to improve their processes and make more informed decisions.

TensorFlow and Recruiting

One of the key ways that TensorFlow can help HR departments is by automating and improving the process of recruitment and selection. By training a model on large amounts of data (e.g. from SAP SuccessFactors, Workday etc.), HR departments can use TensorFlow to identify the most important factors in determining a successful candidate and automate the process of sifting through resumes and applications. This can save HR departments a significant amount of time and resources, and allow them to focus on other important tasks.
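As a deliberately tiny sketch of that idea, the following toy perceptron „learns“ from a handful of past hiring decisions which candidate features matter, then scores a new applicant. The features, data and the hand-rolled model are invented for illustration; a real pipeline would train a proper TensorFlow model on historical data from the HR system:

```python
# Toy perceptron: learn which candidate features predicted a successful hire.
# Features: (years of experience, skill match 0-1, referral 0/1).
# All data is synthetic; a production setup would use tf.keras instead.

def predict(w, b, x):
    return 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else 0

def train(samples, labels, epochs=50, lr=0.1):
    w, b = [0.0] * len(samples[0]), 0.0
    for _ in range(epochs):
        for x, y in zip(samples, labels):
            err = y - predict(w, b, x)          # perceptron update rule
            w = [wi + lr * err * xi for wi, xi in zip(w, x)]
            b += lr * err
    return w, b

X = [(5, 0.9, 1), (1, 0.2, 0), (3, 0.8, 0), (0, 0.1, 0), (7, 0.7, 1), (2, 0.3, 0)]
y = [1, 0, 1, 0, 1, 0]                          # 1 = was hired successfully
w, b = train(X, y)
print(predict(w, b, (4, 0.85, 1)))              # score a new applicant
```

The point is not the algorithm but the workflow: past decisions become training data, and the learned weights then pre-sort new applications automatically.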

TensorFlow and Performance Management

Another area where TensorFlow can be useful for HR departments is in performance management. By training a model on data about an employee’s past performance, HR departments can use TensorFlow to identify patterns and trends that may indicate an employee’s potential for future success. This can help HR departments make more informed decisions about promotions, salary increases, and other important decisions related to employee performance.

TensorFlow can also be used to improve the accuracy and fairness of salary and compensation decisions. By training a model on data about an employee’s past performance, job responsibilities, and other factors, HR departments can use TensorFlow to identify any potential biases or inconsistencies in their current compensation practices. This can help HR departments ensure that their compensation decisions are fair and based on objective criteria, and can help to prevent discrimination and other potential legal issues.

TensorFlow and Reporting

In addition to these specific applications, TensorFlow can also help HR departments in more general ways. For example, TensorFlow can be used to automate and improve the process of generating reports and analytics, which can help HR departments make more informed decisions about the effectiveness of their policies and practices. Additionally, TensorFlow can be used to identify potential issues and trends within an organization, such as high turnover rates or low employee satisfaction, and provide HR departments with the information they need to address these issues.

TensorFlow to identify employees at risk of leaving

Traditional methods of predicting employee turnover often rely on manual analysis of a small number of data points, such as employee performance reviews or exit interviews. This can be time-consuming and may not provide a complete picture of an employee’s likelihood of leaving the company.

TensorFlow, on the other hand, can analyze vast amounts of data from various sources, including employee performance data, demographics, and other relevant factors. This gives HR departments a more comprehensive view of an employee's likelihood of leaving, enabling more informed decisions about retention strategies. Traditional methods may also miss subtle patterns or trends that indicate an employee may be about to leave; TensorFlow can surface these patterns, providing HR departments with valuable insight into the factors contributing to turnover.

From Re-Action to Action: act before an employee leaves.

One example of how TensorFlow can be used in the area of employee turnover prediction is through the development of a predictive model. This model could be trained on a large dataset of employee data, including factors such as performance metrics, demographics, and job satisfaction. The model could then be used to predict the likelihood of an individual employee leaving the company based on the data provided: for example, the model may identify that employees with low job satisfaction are more likely to leave. HR departments could then implement strategies to improve job satisfaction, such as offering training or career development opportunities, in an effort to reduce turnover.

Another application in the area of turnover prediction is an employee turnover dashboard. Such a dashboard could give HR departments a visual representation of turnover data, allowing them to easily spot trends and patterns. It could also provide real-time alerts when an employee is at risk of leaving, allowing HR to take immediate action to retain that employee.
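A hand-written sketch of the dashboard alert idea. The field names, weights and threshold below are invented; in practice the score would come from a model trained on historical HR data rather than hand-picked weights:

```python
# Hypothetical turnover-risk score built from a few illustrative signals.
def turnover_risk(emp):
    """Combine a few signals into a 0-1 risk score."""
    risk = 0.4 * (1 - emp["job_satisfaction"])        # 0 = happy .. 1 = unhappy
    risk += 0.3 * (1 if emp["months_since_promotion"] > 24 else 0)
    risk += 0.3 * min(emp["overtime_hours_per_week"] / 20, 1)
    return round(risk, 2)

def at_risk(employees, threshold=0.5):
    """Dashboard-style alert list: everyone whose score crosses the threshold."""
    return [e["name"] for e in employees if turnover_risk(e) >= threshold]

staff = [
    {"name": "A. Meyer", "job_satisfaction": 0.2,
     "months_since_promotion": 30, "overtime_hours_per_week": 15},
    {"name": "B. Chen", "job_satisfaction": 0.9,
     "months_since_promotion": 6, "overtime_hours_per_week": 2},
]
print(at_risk(staff))
```

The dashboard would simply refresh this list against live HR data and raise an alert whenever a name appears.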

TensorFlow vs. Azure Cognitive Services in HR processes

As stated above, TensorFlow and Azure Cognitive Services are both powerful tools for machine learning and artificial intelligence (AI) applications. While TensorFlow is an open-source library for machine learning and deep learning, Azure Cognitive Services is a suite of AI services provided by Microsoft. Both tools have their own advantages and disadvantages, which should be considered when deciding which to use for a particular project.

One major advantage of TensorFlow is its flexibility. TensorFlow allows developers to build and train their own custom machine learning models, which can be tailored to specific applications and data sets. This flexibility can be particularly useful for complex projects that require specialized models or algorithms.

Another advantage of TensorFlow is its ability to handle large amounts of data. TensorFlow is designed to scale to large data sets, allowing it to handle large volumes of data without sacrificing performance. This makes it ideal for projects that require the analysis of large amounts of data, such as natural language processing or image recognition.

However, TensorFlow also has some disadvantages. One of the main disadvantages of TensorFlow is its complexity. TensorFlow is a powerful tool, but it can be difficult for beginners or inexperienced IT departments to learn and use. In order to use TensorFlow effectively, developers need to have a strong understanding of machine learning algorithms and techniques, as well as experience with programming languages such as Python.

In contrast, Azure Cognitive Services is a more user-friendly tool. Azure Cognitive Services provides pre-trained machine learning models that can be easily integrated into applications without the need for extensive programming knowledge. This makes it a good choice for developers who are new to machine learning or who want to quickly add AI capabilities to their applications.

Another advantage of Azure Cognitive Services is its availability. Azure Cognitive Services is available as a cloud-based service, which means that developers can easily access and use the service without the need to install any software or hardware. This can be particularly useful for developers who are working on projects that require fast deployment or who do not have access to dedicated machine learning hardware.

However, Azure Cognitive Services also has some disadvantages. One major disadvantage of Azure Cognitive Services is its cost. Azure Cognitive Services is a subscription-based service, which means that developers need to pay for the service on a monthly or annual basis. This can be expensive, especially for projects that require the use of multiple Azure Cognitive Services.

Another disadvantage of Azure Cognitive Services is its lack of flexibility. Because Azure Cognitive Services provides pre-trained models, developers are limited to using the models that are provided by the service. This can be limiting for projects that require custom models or algorithms.

In conclusion, TensorFlow and Azure Cognitive Services are both powerful tools for machine learning and AI applications. TensorFlow offers flexibility and the ability to handle large amounts of data, but it can be complex and difficult to use. Azure Cognitive Services is user-friendly and available as a cloud-based service, but it can be expensive and lacks flexibility. The best choice between the two will depend on the specific requirements of the HR project and the experience and expertise of the development team.

In my company my-vpa.com, which is essentially an HR tech company, we mainly use Azure and AWS Comprehend for our HR processes. For example, we implemented an AI-powered zero-touch recruiting process that is capable of recruiting up to 200 assistants per month.

Questions? Comments? Want to chat? Contact me on Mastodon, Twitter or send a mail to ingmar@motionet.de

Categories
AI business IT

AI and the importance of data governance

Data governance and AI are two important, closely related concepts that can work together in an enterprise to improve the efficiency and effectiveness of business operations. Let me lay out why there is no AI-powered process without proper data governance (DG):

What is data governance?

At a high level, data governance refers to the processes and policies that are put in place to manage and oversee the collection, storage, and use of data within an organization. This can include defining roles and responsibilities for data management, establishing standards and protocols for data quality and security, and implementing systems for monitoring and auditing data usage.

In an enterprise, AI and DG can work together in several ways. For example, data governance can help ensure that the data used for AI models is of high quality and is properly managed and protected. This can involve implementing processes for verifying the accuracy and completeness of the data, as well as setting up systems for securing the data and monitoring its usage.
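As a small illustration of the „verify before you train“ idea, here is a minimal data-quality gate. The required fields and the plausibility range are invented placeholders for whatever your governance policy actually defines:

```python
# Minimal data-quality gate: reject records that fail basic completeness
# and plausibility checks before they reach model training.
REQUIRED = {"employee_id", "department", "salary"}

def validate(record):
    """Return a list of data-quality problems; an empty list means usable."""
    problems = []
    missing = REQUIRED - record.keys()
    if missing:
        problems.append(f"missing fields: {sorted(missing)}")
    if "salary" in record and not (0 < record["salary"] < 1_000_000):
        problems.append("salary out of plausible range")
    return problems

clean = {"employee_id": 1, "department": "HR", "salary": 52000}
dirty = {"employee_id": 2, "salary": -5}
print(validate(clean), validate(dirty))
```

Gates like this are the operational end of a governance policy: the rules come from the policy documents, the code merely enforces them at the point where data enters the AI pipeline.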

Additionally, data governance can help to ensure that the AI models being used by the enterprise are fair, ethical, and transparent. This can involve establishing guidelines and protocols for evaluating the performance and biases of AI models, as well as implementing systems for monitoring and auditing their usage.

Here are some examples of how data governance and AI can be integrated in an enterprise:

  • Developing a comprehensive data strategy that outlines the goals and objectives of the organization’s AI initiatives, as well as the roles and responsibilities of various teams and individuals involved in data management and AI development.
  • Establishing clear policies and guidelines for the collection, storage, and use of data, including guidelines for data quality, security, and privacy.
  • Implementing processes for data access and decision-making that ensure that data is used consistently and ethically, and that the organization’s AI models are trained and evaluated on a diverse and representative dataset.
  • Establishing a data governance board or committee that is responsible for overseeing the organization’s data governance and AI initiatives, and for making decisions about the use of AI in the organization.
  • Implementing regular training and education programs for employees on topics related to data governance and AI, to ensure that everyone in the organization is aware of the organization’s policies and practices.

Questions? Comments? Want to chat? Contact me on Mastodon, Twitter or send a mail to ingmar@motionet.de

Categories
business IT sustainability

GDPR-related fines have risen significantly. And that's good.

In a panel discussion this week, I was arguing with the great Tom Moran about whether GDPR in Europe was being properly pursued by regulators. My thesis was that officials would grant an unofficial 5-year transition period (that would be until 2023) before fines really get enforced. Now the FT reports that fines have risen 40% in the last year, which underpins my thesis.

And this is a good thing. Why?

Not going too much into detail, but I'm a fan of GDPR (and no, not of the stupid cookie-banner hell). In short, GDPR consists of a whole stack of

  1. organizational,
  2. technical, and
  3. accountability regulations.

Especially (3) is important, as it shifts accountability from consumers to companies. Done right, GDPR can bring huge competitive advantages to companies and organizations, as trust and transparency are increasingly in demand among consumers. The higher the trust, the

  1. longer the customer stays with your company, and
  2. the more data / assets the customer entrusts to your company,

leading to a higher customer lifetime value. And that is sustainable.
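A back-of-the-envelope sketch of that claim, using the standard geometric-series approximation of customer lifetime value (all numbers are invented):

```python
def clv(monthly_margin, retention, discount=0.01):
    """Geometric-series CLV approximation: margin * r / (1 + d - r)."""
    return monthly_margin * retention / (1 + discount - retention)

low_trust = clv(20, 0.80)    # customer churns quickly
high_trust = clv(20, 0.95)   # trust keeps the customer (and their data) around
print(round(low_trust), round(high_trust))
```

Even a modest jump in monthly retention multiplies lifetime value several times over, which is the economic argument for investing in trust and transparency.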

So why is enforcing GDPR a good thing, and the rise in fines a good signal? When there is a new law in town, enforcement is hard and lawsuits become daily business. That fines are rising now suggests that officials are increasingly sure of being on the winning side. And therefore we, the customers, are too.

Categories
business

notion for…

Somebody in the Twitterverse recently wrote that „build Notion for…“ will become a standard term. I usually don't like these „Uber for“ comparisons, but in this case there's a point. Notion's power comes from its ever-growing template gallery. I see a market where you could say: „our working template for…

  • facebook campaigns
  • twitter campaigns
  • newsletter-list building
  • online webinars“

is XXX. And you as a customer don't need to reinvent the wheel every time you start a campaign. Make these work templates best in class by getting the „rockstars“ of each vertical on board, and there's a short path to market domination.

Pro Tip: start implementing these work-templates by using virtual assistants from my-vpa 😉

Categories
business

Remote Work, Mobile Working or Teleworking: the main thing is communication!

Whatever you call it, whether remote work, mobile working or teleworking, behind it all is one or more employees who are (supposed to be) working together as a team, independent of location and possibly of time. The glue that holds this together is communication.

We at my-vpa are a 100% remote team, which is why my colleagues have written down a few best-practice rules in a guide for remote work communication. I find it well worth reading and genuinely helpful. If you leave your email address, you can download the whitepaper here and will receive a few more tips afterwards. Go get it!

Disclaimer: I am a co-founder of my-vpa.

Photo by Jason Rosewell on Unsplash

Categories
business my-vpa

What’s your career path? A matrix approach to career building at my-vpa.

When it comes to company building, core questions to be answered are:

  1. how do you want to set up team structures,
  2. how do you want your employees to find their direction, and
  3. how do you want to incentivize them?

During my research for input on these questions, I found progression.fyi a valuable source of inspiration for different, yet transparent approaches.

At my-vpa we want to offer a transparent approach to career building

Most notably, all the companies represented there take an open approach to career paths. I find that a smart move to show potential new colleagues their way through the company's „hierarchy“ even before they actually start.

For us I can imagine a kind of career matrix:

  • X-axis representing the level of cooperation ability, i.e. „how much influence / responsibility“ you have in the company
  • Y-axis representing your technical skills and the level for each skill

my-vpa career matrix

So this approach mainly values the contribution each colleague adds to the company, whether as an individual contributor who is highly skilled, or as the manager of an integrated team at a different technical skill level.


Meaning: a colleague positioned in square C3 adds a similar value to the company as a colleague in square A3, and is thus compensated in the same way.

„Technical“ in this context can mean any discipline, be it marketing, HR, finance or sales.

This is still work in progress and not yet implemented at my-vpa.

My thoughts were also inspired by 8thlight.
Categories
business

Okanda X Spacebase

With the announcement that we have sold Okanda to Spacebase, an exciting journey that I have been on for the last 5 years comes to an end.

Okanda was founded in 2014 with the goal of brokering meeting rooms in hotels to customers in real time; about a year later, Spacebase launched in Berlin. Spacebase brokers meeting rooms primarily in stylish locations and offices.

As an investor and board member of motionet AG, I accompanied the founding by Dirk Führer (ex-CCO of Steigenberger). Further partners from our network invested as well.

In 2016, at the request of the supervisory board, I took over the management of Okanda AG as interim CEO and restructured the company: with our firm „Aporia Ventures“ we specialize in founding and restructuring digital companies.

„Aporia“ has developed its own effective and pragmatic operating models which, beyond purely financial optimization measures, also cover the critical areas of IT, HR, sales and marketing. With these data-driven models we are able to found new companies sustainably and to transform and restructure existing ones. That is how we managed to first stabilize Okanda in this fast-growing market segment and then establish it as a leading player.

I first met Julian from Spacebase in 2016 and got to know him as a dynamic, focused and likeable CEO and industry insider. We clicked right away and kept in touch over the years. It has always been important to me to get to know the „competition“ as well, and to exchange views on how the new markets and verticals that digital companies often operate in are developing and which dynamics exist.

I see huge potential in this market: with the rise of WeWork and co., it is becoming ever clearer that flexible working, presenting and workshopping will be the future model of work culture, at least for all „knowledge workers“.

Spacebase and Okanda rock 😉

Categories
business IT

Scanbot and my-vpa for automated, complex workflows. Today: receipts

Some people say: „Collecting and sorting receipts: great, I'll do that on a rainy day.“ I have always found it tedious, so I am all the more pleased that, via the sharing feature of our new my-vpa app for iOS and Android, we can now share documents from another app directly into the my-vpa app. It works like this:

1. I created a task at my-vpa for my assistant that roughly reads: „All receipts arrive as PDFs; please create a summed statement for the previous month from them and send it by email to our accounting department, with me in cc.“

2. Once a month I use Scanbot for iOS to scan all receipts for the previous month.

3. I tap „Share as PDF“, which automatically uploads the PDF to the right task at my-vpa.

4. My VPA creates the statement and sends it to our company's accounting department.

The great thing is that this works system-wide, so from any other app that supports „sharing“ I can upload documents directly to the matching tasks in the my-vpa app. These can be images, voice notes, videos or documents such as invoices, corrections, orders, notifications, voicemail messages, etc. Next, I will build a workflow that sends certain documents straight to the tax advisor 🙂

Categories
business IT

My web analytics setup with KPIs for ecommerce sites

Since I keep getting asked about KPIs and a matching analytics setup for ecommerce sites, I thought I would write down the ones that matter most to me. All of these KPIs can be implemented fairly easily with Google Analytics, Supermetrics and Google Sheets. Parameters for conversion values etc. then need to be set and adjusted depending on the ecommerce suite. I have divided the KPIs into different categories:

Measuring return on investment (ROI)

The goal here is to measure the return behind each individual measure/campaign.

  1. Cost per Visit: What does a visitor cost? (Google campaign 1 costs xx per click, Facebook campaign 2 costs xx per click, …); on top of that: monthly costs for SEO optimization (SEO costs money too)
  2. Cost per Sale: What does a sale cost?
  3. Sales per Channel: How many sales (at what value) happen per channel (Facebook, AdWords, search)?
  4. Sales per Visit: number / value of sales per site
  5. Purchase History: What did a customer buy, when, and at what value?
  6. Cost per KPI: What does a micro-conversion cost (whitepaper download, newsletter signup, …)?
  7. Time to Conversion: How often does a customer have to visit / after how much time do they convert?
  8. Cart Abandonment Rate: funnel analysis: at which step do which customers drop out of the checkout process?
  9. Average Order Value: What is the average transaction volume?
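Most of the ROI KPIs above are simple ratios once the raw numbers are exported (e.g. via Supermetrics into a sheet). A minimal sketch with invented figures:

```python
# Invented campaign figures; in practice these come from the analytics export.
campaigns = {
    "google_ads": {"cost": 500.0, "visits": 1000, "sales": 25, "revenue": 2000.0},
    "facebook":   {"cost": 300.0, "visits": 1500, "sales": 10, "revenue":  600.0},
}

report = {}
for name, c in campaigns.items():
    report[name] = {
        "cost_per_visit":  c["cost"] / c["visits"],      # KPI 1
        "cost_per_sale":   c["cost"] / c["sales"],       # KPI 2
        "avg_order_value": c["revenue"] / c["sales"],    # KPI 9
    }

for name, kpis in report.items():
    print(name, {k: round(v, 2) for k, v in kpis.items()})
```

The value is not in the arithmetic but in computing the same ratios per channel and per campaign, so that an expensive channel with a high average order value can be compared fairly with a cheap one that barely converts.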


KPIs for measuring lead generation campaigns

The goal here is to measure the conversion rates for micro-/macro-conversions.

  1. Product or Service Page Conversion Rate: what is the conversion rate for micro- and macro-conversions (newsletters, whitepapers, purchases)?
  2. Landing Page Bounce Rate: How high is the bounce rate for paid campaigns (how effective are the campaigns)?
  3. Landing Page Conversion Rate: how well does each campaign perform?
  4. Email Open Rate: which newsletters get opened, and how often?
  5. Email Click Through Rate: Which articles in the newsletter get clicked how often (and lead to conversions)?


KPIs for measuring purchase intent

The goal here is to evaluate how strong the potential purchase intent on the site is.

  1. Branded Keyword Visits: evaluation of search results: how often did visitors arrive via optimized keywords?
  2. Direct Visits: How many direct visits to the site were there?
  3. Evaluation of e.g. a store locator: which locations got which page views and which contacts via the site?
  4. Direct Email Rate: how many emails with which intent (purchase intent) come in via the platform?
  5. Call Rate / Chat Rate: Who received how many calls / chat requests?


KPIs for measuring platform usage

The goal here is to measure onsite / offsite behavior.

  1. 301 Redirect Rate: offline ads (catalogs etc.) should carry a vanity URL (e.g. site.com/2018) that leads to a landing page via a 301 redirect
  2. 301 Conversion Rate: how well do the offline measures convert?
  3. Inbound Links: Who links to the site, and with what result?
  4. Visitor Loyalty (new vs. returning (cohort)): Which customers come back, when and how often?
  5. Follower Growth Rate: how are follower numbers developing on Facebook, Twitter, Insta etc.? (Google Sheet with Supermetrics)
  6. Social Media Share Rate: How often is which content shared?
  7. Comment Rate: which content gets commented on, and how often?
  8. Pages per Session: how many pages does a visitor view per visit?
  9. New Visitors: how many visitors in a given period?
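The vanity-URL idea from point 1 can be reduced to a tiny redirect table on the web server; the paths and UTM parameters below are examples only:

```python
# Each offline ad carries a short vanity path that 301-redirects to a
# tagged landing page, so the offline campaign shows up in analytics.
VANITY_REDIRECTS = {
    "/2018": "/landing/catalog-2018?utm_source=print&utm_campaign=catalog2018",
}

def resolve(path):
    """Return the (status, location) pair the web server should answer with."""
    target = VANITY_REDIRECTS.get(path)
    return (301, target) if target else (404, None)

print(resolve("/2018"))
```

Because the redirect target carries campaign tags, every visit that starts from the printed catalog is attributable, which is what makes the 301 redirect rate measurable in the first place.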