AWS Comprehend: connecting with Python

When you start adding AI services, Python is handy for writing simple connection tools. Today: boto3.

This is the simple connector I wrote; you can also get it from my GitHub:

# python file to ask amazon comprehend for sentiment
import boto3

# Replace the following with your own AWS access key ID and secret key
aws_access_key_id = "YOUR AWS KEYID"
aws_secret_access_key = "YOUR AWS KEY"

# Create a boto3 client for the Amazon Comprehend API
comprehend_client = boto3.client("comprehend", aws_access_key_id=aws_access_key_id, aws_secret_access_key=aws_secret_access_key)

# Use the Amazon Comprehend API to analyze some text

text = "Danke, das haben Sie gut gemacht."
response = comprehend_client.detect_sentiment(Text=text, LanguageCode="de")

# Print the detected sentiment
print(response["Sentiment"], response["SentimentScore"])

It’s simple and straightforward. To get it up and running, have the following prerequisites in place:

  1. You need AWS console access
  2. Create programmatic IAM access (an access key) with a policy that allows the Comprehend actions
  3. Install python3, pip, and the awscli
  4. Edit your .aws config to set a region, e.g. eu-central-1
  5. pip install boto3
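Once the call succeeds, the interesting part is the response dict. Here is the shape a detect_sentiment response comes back in – the score values below are made up for illustration – plus a tiny helper to pick the dominant label:

```python
# A sample detect_sentiment response (shape as returned by Comprehend;
# the score values are invented for this example).
sample_response = {
    "Sentiment": "POSITIVE",
    "SentimentScore": {
        "Positive": 0.93,
        "Negative": 0.01,
        "Neutral": 0.05,
        "Mixed": 0.01,
    },
}

def dominant_sentiment(response):
    # Return the label with the highest confidence score.
    scores = response["SentimentScore"]
    return max(scores, key=scores.get)

print(dominant_sentiment(sample_response))  # Positive
```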


Now that you want to integrate AI into your custom-built software: what are the best open source modelling tools out there?

There are several open source AI model tools available, each with its own unique features and capabilities. Some of the most popular options include TensorFlow, Keras, PyTorch, and scikit-learn.

TensorFlow is a powerful open source library for deep learning, developed by the Google Brain team. It allows users to build and train complex neural network models for a variety of tasks, including image recognition, natural language processing, and time series forecasting. TensorFlow is highly scalable and can be used for both research and production environments.

Keras is a high-level API for building and training deep learning models. It is built on top of TensorFlow and is designed to be easy to use and intuitive for developers who are new to deep learning. Keras allows users to quickly prototype and experiment with different architectures and hyperparameters, making it a popular choice for researchers and data scientists.

PyTorch is another popular open source library for deep learning. It is developed by Facebook AI Research and is designed to be flexible and easy to use. PyTorch allows users to build complex neural network models and perform computations on tensors, a data structure similar to matrices. PyTorch is known for its support for dynamic computational graphs, which allow users to build models on the fly and modify them during training.

scikit-learn is a machine learning library for Python that is widely used in the data science community. It offers a wide range of algorithms for classification, regression, clustering, and dimensionality reduction, along with tools for model evaluation and selection. scikit-learn is designed to be easy to use and can be integrated with other libraries, such as NumPy and Pandas, to create powerful data analysis pipelines.
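To make that API concrete, here is a minimal, hedged sketch – the toy data and the choice of LogisticRegression are mine, purely for illustration:

```python
# Tiny scikit-learn sketch: fit a classifier on a made-up toy dataset.
from sklearn.linear_model import LogisticRegression

X = [[0.0], [0.2], [0.8], [1.0]]   # one feature per sample
y = [0, 0, 1, 1]                   # binary labels

model = LogisticRegression()
model.fit(X, y)

# Predict labels for two unseen values.
print(model.predict([[0.1], [0.9]]))
```

The same fit/predict pattern applies across nearly all scikit-learn estimators, which is a big part of why it integrates so well into data analysis pipelines.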

All of the above are written in Python; PyTorch additionally ships APIs for C++ and Java. scikit-learn, as noted, focuses on traditional machine learning algorithms rather than deep learning.

A difference is the level of abstraction provided by the libraries. TensorFlow and PyTorch offer low-level APIs that allow users to build and customize their own neural network architectures, while Keras provides a higher-level API that allows users to quickly build and train pre-defined architectures. scikit-learn offers a more general-purpose API for traditional machine learning algorithms.

In terms of performance, TensorFlow, Keras, and PyTorch are all optimized for training deep learning models on large datasets and can be used to build models that can run on GPUs and TPUs. scikit-learn is optimized for smaller datasets and can run on CPUs, but it may not be as efficient for larger datasets.

Hope this overview helps you find the model builder you want to go with 🙂

Questions? Comments? Want to chat? Contact me on Mastodon, Twitter or send a mail to ingmar@motionet.de


VAN in Kress

We are, admittedly, a bit proud that next year VAN will have been around for ten years and that the classical music „niche“ has become unthinkable without us. Thank you, Kress, for the short portrait:

By the way, for a few weeks now we have been running our own Mastodon instance at https://classicalmusic.social. Come have a look!


What is the difference between machine learning and deep learning algorithms?

When applying AI to business use cases, one has to consider two different families of learning algorithms, each of which performs significantly better in its specific area:

  1. Machine learning algorithms
  2. Deep learning algorithms

Let’s dig a bit deeper into this:

Machine learning and deep learning are two subfields of artificial intelligence (AI), with deep learning being a subset of machine learning. While both technologies are based on the concept of enabling machines to learn from data, there are key differences between the two that set them apart.

What’s the level of human intervention needed?

One of the main differences between machine learning and deep learning is the level of human intervention required. Classical machine learning algorithms rely on human-engineered features and rules to analyze data and make predictions. Deep learning algorithms, in contrast, learn useful features from the raw data themselves, with far less manual feature engineering. This makes them more effective at handling complex tasks and data sets.

What type of data can they handle better?

Another key difference between the two technologies is the type of data they can handle. Machine learning algorithms are typically used to analyze structured data, such as numbers and tabular records, which makes them well suited for tasks where the data is already organized in a specific format, like churn prediction or credit scoring. Deep learning algorithms, in contrast, can handle both structured and unstructured data, such as images, videos, and audio. This makes deep learning better suited for tasks that require the analysis of complex, unstructured data, such as image and speech recognition.

Differences in Performance

In terms of performance, deep learning algorithms are generally more accurate and efficient than machine learning algorithms. This is because deep learning algorithms can learn and adapt to complex data patterns and relationships, while machine learning algorithms rely on human-defined rules and algorithms. As a result, deep learning algorithms are better suited for tasks that require high accuracy and precision, such as image and speech recognition.

Example of a machine learning algorithm

One example of a machine learning algorithm is a decision tree. Decision trees are a type of algorithm that uses a tree-like structure to make predictions based on a set of rules and conditions. The algorithm starts at the root of the tree and follows a series of rules and conditions to make a prediction. For example, in the task of predicting whether a customer will churn or not, a decision tree algorithm might start by evaluating the customer’s tenure with the company. If the customer has been with the company for a long time, the algorithm might conclude that they are unlikely to churn. If the customer has been with the company for a shorter period of time, the algorithm might evaluate other factors, such as their usage of the company’s services, to make a prediction. This process continues until the algorithm reaches a leaf node, where it makes a final prediction. Decision trees are effective at handling structured data and making accurate predictions, but they require human intervention to define the rules and conditions used in the algorithm.
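The churn logic described above can be sketched as a tiny hand-written tree – the thresholds and features below are invented for illustration; a real decision tree would learn them from data:

```python
# Toy decision tree for the churn example (hand-written rules, not learned).
def predict_churn(tenure_months, monthly_logins):
    # Root node: long-tenured customers are assumed unlikely to churn.
    if tenure_months >= 24:
        return "no churn"
    # Inner node: for shorter tenure, evaluate usage of the company's services.
    if monthly_logins >= 10:
        return "no churn"
    # Leaf node: short tenure and low usage -> predict churn.
    return "churn"

print(predict_churn(36, 2))   # long tenure
print(predict_churn(6, 15))   # short tenure, heavy usage
print(predict_churn(6, 1))    # short tenure, low usage
```

Each `if` corresponds to one node of the tree; the returned strings are the leaf predictions.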

Example of a deep learning algorithm

One example of a deep learning algorithm is a convolutional neural network (CNN). CNNs are a type of deep learning algorithm that is commonly used for tasks such as image and speech recognition. A CNN works by taking an input image and passing it through multiple layers of filters and transformations. Each layer of filters is designed to identify specific patterns and features in the image, such as edges and shapes. As the image passes through each layer, the algorithm learns and adapts to the data, identifying more complex patterns and relationships in the image. This allows the algorithm to make accurate predictions about the content of the image.
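The core building block of a CNN is the convolution itself. Here is a minimal sketch in plain Python – the 3×3 vertical-edge kernel and the toy 4×4 image are made up, and a real CNN would of course learn its filter values during training:

```python
# Minimal 2D convolution, the core operation of a CNN layer.
def conv2d(image, kernel):
    kh, kw = len(kernel), len(kernel[0])
    h, w = len(image), len(image[0])
    out = []
    for i in range(h - kh + 1):
        row = []
        for j in range(w - kw + 1):
            # Slide the kernel over the image and sum the element-wise products.
            acc = 0
            for a in range(kh):
                for b in range(kw):
                    acc += image[i + a][j + b] * kernel[a][b]
            row.append(acc)
        out.append(row)
    return out

# A hand-crafted vertical-edge filter: it responds where pixel values
# change from left to right.
edge_kernel = [[1, 0, -1],
               [1, 0, -1],
               [1, 0, -1]]

# A toy image with a sharp vertical edge in the middle.
image = [[0, 0, 1, 1],
         [0, 0, 1, 1],
         [0, 0, 1, 1],
         [0, 0, 1, 1]]

feature_map = conv2d(image, edge_kernel)
print(feature_map)
```

In a real network many such filters are stacked in layers, and their values are adjusted by training rather than written by hand.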

Hope this helps a bit to understand the differences 🙂

Questions? Comments? Want to chat? Contact me on Mastodon, Twitter or send a mail to ingmar@motionet.de


AI Enterprise Architecture

In enterprise tech we are entering a new stage, with AI being integrated into business processes to leverage its full potential. Enterprise Architecture therefore has to adapt and embrace AI services, models, and technology in its frameworks.

What is AI Enterprise architecture?

AI Enterprise architecture is the framework or blueprint that guides the design and implementation of artificial intelligence systems. It defines the components and interactions of an AI system and outlines the relationships between the different components.

AI Enterprise architecture focuses on the specific components and technologies that make up an AI system. This can include the algorithms and models that are used for machine learning, the hardware and software infrastructure that supports the AI system, and the data sources and storage systems that are used to train and evaluate the AI system.

AI Enterprise architecture is a crucial part of IT enterprise architecture, which is the overall framework for the design and implementation of an organization’s IT systems. IT enterprise architecture provides a common language and set of principles for understanding, designing, and implementing IT systems, and helps to ensure that these systems are aligned with the organization’s business goals and objectives.

The integration of AI Enterprise architecture into IT enterprise architecture can help to ensure that AI systems are designed and implemented in a way that is consistent with the organization’s overall IT strategy. It can also help to ensure that AI systems are integrated seamlessly with the rest of the organization’s IT systems, and can provide the necessary data and resources to support the AI system’s operation.

In addition, technical AI architecture can help to identify potential gaps and overlaps in the organization’s AI capabilities, and can provide a framework for prioritizing and addressing these gaps. This can help to ensure that the organization’s AI investments are focused on the areas that will provide the greatest benefit, and can help to avoid duplication of effort and resources.

In general we can divide AI services into different areas:

  1. Integrated AI services, like OCR or the AI features built into software such as MS Teams. These are preconfigured services, very specific to one exact use case.
  2. External cloud-based services, like Azure Cognitive Services, with pre-trained machine learning models that developers can use to add specific capabilities to their applications.
  3. Software libraries like TensorFlow, a free and open source library for machine learning and artificial intelligence. Developed by Google and used by many large companies and research institutions, it is particularly well suited to deep learning, i.e. training neural networks on large amounts of data. TensorFlow provides a library of pre-built neural network modules, algorithms for optimizing the training process, and tools for visualizing and debugging it. A key feature is that models can be built and trained on a wide range of platforms – desktop computers, mobile devices, and cloud-based systems – which makes it easy to develop and deploy models in many different environments.

How can companies benefit from a powerful AI Enterprise architecture? As an example: HR

Here are some examples of how AI can be used to improve HR processes and make them more efficient:

  • Recruitment: AI can be used to automate many of the tasks involved in recruiting new employees. For example, AI algorithms can be used to sort through large numbers of job applications and identify the most qualified candidates based on their resumes and other materials. This can save HR professionals a lot of time and effort, and allow them to focus on other important tasks.
  • Employee retention: AI can also be used to help companies retain their best employees. By analyzing data on employee behavior and performance, AI algorithms can identify potential risks of employee turnover, such as low job satisfaction or high levels of stress. This can help HR professionals take proactive steps to address these issues and improve employee retention.
  • Performance management: AI can be used to automate the process of performance evaluations for employees. By analyzing data on employee performance, AI algorithms can provide managers with insights into which employees are meeting their goals and which may need additional support. This can help HR professionals ensure that employees are being evaluated fairly and consistently, and that they have the support they need to succeed.
  • Learning and development: AI can also be used to improve learning and development programs within a company. By analyzing data on employee skills and career goals, AI algorithms can suggest personalized learning paths for employees, helping them to develop the skills they need to advance in their careers. This can help HR professionals provide employees with the support they need to grow and succeed within the company.

As you can see, AI has the potential to greatly benefit HR departments by automating many of the tasks involved in managing employees and improving the efficiency of HR processes. By using AI technologies, HR professionals can save time and effort, and focus on providing the best possible support for employees.


Overall, the integration of technical AI architecture into IT enterprise architecture can help to ensure that AI systems are designed and implemented in a way that is aligned with the organization’s business goals and objectives, and can help to optimize the value of these systems for the organization.

Questions? Comments? Want to chat? Contact me on Mastodon, Twitter or send a mail to ingmar@motionet.de


IRC ftw

Inspired by being active on the social web again (aka Mastodon), I reconnected to IRC. It’s really fun again and feels a bit… cosy. I started my „online career“ there 25 years ago.

I still remember some of the shortcuts („/nick fredl79“) – and nowadays we have „nickserv“, which fixes what used to be a real issue in the old days.

I hang out mostly on Libera, which has all the old channels – programming, postgresql… you name it. It also seems to be one of the larger communities out there.

Of course there is still the issue of spam kiddies, but mostly I find the conversations pleasant and polite.

I use Textual 7 on the Mac to connect – no intention to install it on mobile though.

IT sustainability

Moving to the MacBook Pro M1 16″

The battery life of my 2017 MacBook Pro 13″ went down to about 45 minutes. I can’t blame it, since it has been running basically around the clock for four years.

After (another) temporary switch to the Dell/Windows world, I became so annoyed that I decided to get the new M1 16″ – 16″, because my eyes are not getting any better 😉

I went for the M1 Pro and not the Max, because my daily tools nowadays include more Teams than Xcode, so the Pro seemed fast enough. I did go for 32 GB of RAM though.

Boy, is this a fast machine. There is NO app (Teams, Safari, Excel…) that does NOT start in under a second. I can have endless Safari windows and tabs and don’t notice any slowdown. But the most amazing thing is the battery life. I literally use Teams and Slack all day, and Teams especially is infamous for its performance.

Battery life of my MacBook Pro after 9 hours of heavy usage

The screenshot above shows 1:12 hours of operation still possible – about 9 hours after the last charge, under heavy usage. As far as I know, the battery capacity is the same as in the older models; it is the Apple Silicon with its fantastic performance-per-watt that makes the difference.

A remark on the post’s picture: it’s a photo I took on a recent trip to Amsterdam’s Straat Museum – not sure if I am allowed to use it here, but it has been my desktop background ever since.

IT sustainability

Sustainability of IT. Sustainability by IT.

Last week I wrote about the inherent connection of Sustainability and IT and why digitalization should be at the heart of every Sustainability effort.
Today I want to go into more detail of the two dimensions of

  1. Sustainability of IT.
  2. Sustainability by IT.

Sustainability of IT.

You may have heard the term „Green IT“, which originated in the 1990s. It all started with the Energy Star label, established in 1992 by the U.S. Environmental Protection Agency. The label promoted energy-efficient monitors and other hardware and made them more recognizable. For example, the sometimes better, sometimes worse working „sleep mode“ was adopted by more and more hardware producers and OS developers as a prerequisite for the label. Today you may have seen the Swedish TCO label on the back of your monitor, which stands for low magnetic and electric emissions.
While these labels cover the hardware side of Green IT, on the software side we don’t have any certifications or labels yet.
That side, though, offers significant leverage, as an analogy to your driving style shows:

  • the hardware is your car – you want to have that working efficiently
  • the software is your driving style: your car might consume 3 l/100 km at 80 km/h – take it to 200 km/h and it will consume disproportionately more.

Although there is no label yet, there are several initiatives and principles for writing and operating efficient software. For example, using hashed or indexed lookups for search is much more efficient than scanning linearly.
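You can see this directly in Python with a rough micro-benchmark – the timings below depend entirely on the machine; only the order-of-magnitude gap between a linear list scan and a hashed set lookup matters:

```python
# Rough micro-benchmark: linear membership test vs hashed membership test.
import timeit

items = list(range(100_000))
items_set = set(items)           # hashed structure, built once up front
needle = 99_999                  # worst case for the linear scan

linear = timeit.timeit(lambda: needle in items, number=100)      # O(n) per lookup
hashed = timeit.timeit(lambda: needle in items_set, number=100)  # O(1) per lookup

print(f"linear: {linear:.4f}s  hashed: {hashed:.6f}s")
```

The set costs memory and a one-time build, but every subsequent lookup does a fraction of the work – exactly the kind of choice that decides how much energy a piece of software burns.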

Another high, but rarely recognized, leverage lies in the retirement of software. As a 2019 study by the German Federal Environment Agency laid out, about 30% of the software installed in data centers is not being used. Although recent studies show that the effectiveness of data centers has improved drastically, thanks to highly effective cooling and a growing switch to green energy, we are still talking about an estimated total CO2 emission of roughly 900 billion kilograms (Pearce 2018) – about as much as the global aviation industry (pre-Corona; Air Transport Action Group 2020). Reducing that by 30% is a lot.

As sustainability is not only about emissions and energy consumption but about decent work and economic growth, reducing risks of software-hacks and data-loss is a further essential goal to be achieved by software retirement.

Sustainability by IT

There are a lot of examples in which IT helps Sustainability efforts of organizations:

  1. Reducing emissions for business travel through virtual meetings
  2. Providing a transparent supply chain by using blockchain technology
  3. Reducing poverty by enabling internet access in developing countries and teaching development skills

Though IT and digitalization can add a lot to reach SDG goals and are a key driver, one has to be aware of the drawbacks:

  1. Digital platforms can lead to alienation and loneliness
  2. Training algorithms can be tricky and lead to faulty, discriminatory results
  3. IT by itself is not sufficient: business processes have to put people in their focus and serve them with purpose

In this article I tried to lay out the two different aspects of IT and sustainability. I hope you liked it, and I am happy to hear your opinion on IT, digitalization, and sustainability.

If you have more questions or ideas regarding this, visit us over at https://www.motionet.de

Pearce, F. (2018). “Energy Hogs: Can World’s Huge Data Centers Be Made More Efficient?” Yale Environment 360, April 3rd. https://e360.yale.edu/features/energy-hogs-can-huge-data-centers-be-made-more-efficient
Air Transport Action Group (2020). Facts and Figures. https://www.atag.org/facts-figures.html

IT life

Setup your mac – the zero touch way

My 2017 MacBook Pro was getting slower over time, and with everything in my Dropbox and in the iCloud backup, I dared to do a fresh install. I had read about scripting your MacBook setup a while ago, so the nerd in me went down that road – and it went well.

With the help of „brew“ and „mas“ I put together a bash setup script which sets up your Mac nearly zero-touch. Here it goes:

  1. We start by checking whether the script has already run.
#!/usr/bin/env bash

if [ -f ~/.osx-bootstrapped.txt ]; then
  # The heredoc prints a notice, then we bail out without making changes.
  cat << EOF
~/.osx-bootstrapped.txt FOUND!
This laptop has already been bootstrapped.
Exiting. No changes were made.
EOF
  exit 0
fi

2. We are going to install brew for some of the essential apps

# Install Brew if it is not already present
BREWINSTALLED=$(command -v brew)
if [[ ${BREWINSTALLED} == "" ]]; then
  echo "Installing Brew"
  ruby -e "$(curl -fsSL https://raw.githubusercontent.com/Homebrew/install/master/install)"
fi

3. Now these are the apps I like to set up. You’ll find the full list of available casks over at brew.

#Required App
brew tap homebrew/cask
brew install mas
brew install --cask google-chrome
brew install --cask slack
brew install --cask spotify
brew install --cask rectangle
brew install --cask tunnelblick

4. You’ll notice I’ve installed „mas“ – with mas you can connect to the App Store via the CLI, very useful in this case.
>> mas list – lists your currently installed apps
>> mas search remote – searches the App Store for all apps containing „remote“. For the script, you need the IDs of the apps.

mas install 405399194
mas install 1418401222
mas install 1484348796
mas install 747648890
mas install 1333542190
mas install 443987910
mas install 1437501942
mas install 504700302
mas install 1147396723
mas install 824183456
mas install 1295203466
mas install 803453959
mas install 1444383602
mas install 462054704
mas install 462058435
mas install 462062816
mas install 985367838

Now you know my apps being installed 😉

5. I love oh-my-zsh, so it is installed last:

if [ ! -d ~/.oh-my-zsh ]; then
    sh -c "$(curl -fsSL https://raw.githubusercontent.com/robbyrussell/oh-my-zsh/master/tools/install.sh)"
    rm ~/.zshrc
    ln -s $CURRDIR/zshrc-config ~/.zshrc
fi

6. Create the marker file, so we know we did it.

touch ~/.osx-bootstrapped.txt

Final thoughts:

  • The MacBook feels much faster again, and battery consumption went down significantly. Even the error message for the battery went away.
  • iCloud sync takes ages: accounts and keychain were still not completely synced after 3 hours. They come, though, bit by bit. But it takes patience.
  • At the beginning I restarted the MacBook holding down „command + r“ and used the disk utility to erase the main disk.
  • Office 365 didn’t install via mas – I had to do that manually.
  • Put the final bash file somewhere in the cloud, like Dropbox, so that you can download and start it after you’ve erased your MacBook.


Odoo as an ERP for SMBs

For a few years now we at Aporia have been using Odoo ERP in our portfolios – a web-based open source ERP system built on Python and Postgres. Odoo is the successor of the old OpenERP and also has a commercial arm, which finances itself mainly through hosted solutions, premium plugins, and support. The free Community edition, however, already contains plenty of features with which you can run a small business. These include:

  1. Pretty website functionality with a basic shop
  2. Quote, order, and invoice management
  3. Warehousing and logistics (drop shipping, third-party deals)
  4. Marketing modules such as customer segmentation and email campaigns
  5. An unbelievable number of additional modules

Thanks to its extensibility via modules („Apps“), the number of functions – many of them contributed by the community – is almost endless, and the plug-in architecture keeps them cleanly encapsulated. That is also the main reason why we have been running Odoo in production for six years.

A lot of water has flowed down the Rhine since then, though, and we are currently at version 14 – we ourselves were still using version 7.

So here is a quick installation guide for getting Odoo 14 running on an AWS EC2 instance.

  1. A t2.micro AWS instance (free tier eligible) is enough, with an Ubuntu 20.x image
  2. Set the security group to open ports 80, 443, and 22. The bundled NGINX makes sure the Odoo server delivers over both http and https
  3. There is a nice install script here that you can download.
  4. Before you run the install script, you should adjust the following parameters:
    • INSTALL_NGINX = True
    • ENABLE_SSL = True (automatically installs Certbot for SSL certificates)
  5. During installation WKHTMLTOPDF is not installed, or not installed cleanly – since you need version 0.12.5-1, download the package here and install it manually. WKHTMLTOPDF is needed if you want to generate invoices as PDFs, for example.
  6. IMPORTANT: by default Odoo offers a „strange feature“: backing up or deleting databases from the login screen. You disable it by adding

    list_db = False

    to /etc/odoo-server. Afterwards, of course, restart the odoo service.


For a fairly simple but highly effective DB backup there is the AutoPostgreSQLBackup script by k0lter, including:

  1. Email notification
  2. Compression
  3. Encryption
  4. Rotation
  5. Database exclusion
  6. Pre and Post scripts

Simply configure it to back up into a directory and mirror that directory to an offsite repository via rsync.