WOMAD

Womad stands for World of Music, Arts and Dance, an international music and arts foundation known primarily for its festivals, held annually in multiple locations around the world.

“You never know where interesting news may come from” is a lesson I have learned over the years.

To confirm it, I will tell you that the first time I found information about Womad, I was reading my son’s English schoolbook with him.

We owe the idea of this music, arts and dance festival to Peter Gabriel.

In 1982 the author of the immortal song in 7/4, the one whose harmonic theme lifted everyone’s thoughts from that perfect green to the sky, together with a group of collaborators started the first festival in Shepton Mallet, Somerset.

Later the project evolved into a mission: to create opportunities for cultural exchange and learning, bringing the arts of different cultures to the widest possible audience through arts education and creative learning projects.

After all, Peter Gabriel’s spirit of cross-pollination came through loud and clear in the blend of tribal percussion and electro-synth sounds of “Shock the Monkey”.

The most striking aspect of the Womad festival being held these days is that Carmen Consoli announced her participation with this picture.

The cantantessa, as she is known in Italy.

I would say she is perfect to represent the strength of roots and the richness of collaborations.

 

Carmen is the first Italian artist to perform at the world music festival.

July 27–30 at Charlton Park: those are the coordinates of Womad 2023.

While waiting, we can watch highlights from last year’s edition: the 40th.

 

 

ChatGPT

GPT stands for Generative Pretrained Transformer.

High-sounding, even slightly unsettling terms that “extend their hand” in introduction, softened by the “chat” prefix.

There is a lot of talk about this “conversational” artificial intelligence, able to chat and answer in-depth questions.

The official website lists among ChatGPT’s features the ability to admit mistakes, challenge incorrect premises, and reject inappropriate requests.

All of this happens through machine learning, using an algorithm trained on “phenomenological data”, that is, data collected from interactions with language in a given environment.

This approach is identified by another acronym: NLP, short for Natural Language Processing.

Natural language is “human” language: unlike structured text data, it does not rely on predefined patterns but evolves flexibly.
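To see why free-form language resists fixed patterns, consider the very first step of most NLP pipelines: breaking raw text into tokens. Here is a toy sketch in Python; the `tokenize` helper and its regex are my own illustrative assumptions, not ChatGPT’s actual tokenizer, which works on learned subword units rather than whole words.

```python
import re
from collections import Counter

def tokenize(text: str) -> list[str]:
    # Lowercase the text and keep runs of letters/apostrophes as word tokens;
    # punctuation and spacing, which vary wildly in natural language, are dropped.
    return re.findall(r"[a-z']+", text.lower())

sample = "Natural language doesn't follow predefined patterns; it evolves."
tokens = tokenize(sample)
print(tokens)
# Counting token frequencies is the simplest kind of "learning from data".
print(Counter(tokens).most_common(2))
```

Even this crude normalization shows the gap between human text (contractions, punctuation, free word order) and the rigid, predefined fields of structured data.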

Artificial Intelligence learns from us.

I don’t know about you, but I would have an immediate point to make in this regard.

OpenAI, the creator of this system, explains:

We launched ChatGPT as a research preview so we could learn more about the strengths and weaknesses of the system and gather user feedback to help us improve its limitations. Since then, millions of people have provided us with feedback, we have made several major upgrades, and we have seen users find value in a wide range of professional use cases, including writing and editing content, brainstorming ideas, helping with programming, and learning new topics.

Let’s try to dwell on the listed features:

Content drafting and editing: indeed, this system can write text, surely better than I can, given that I never do well in the infamous SEO analysis 🙂

Brainstorming ideas: in terms of creativity, I think of the possibility of creating images from just a few words.

In this sense the storm can indeed break out in the results, as the creators themselves explain in this video.

Learning new topics: it also winks at education, presenting these opportunities as interactive and accessible to students.

On Feb. 1, however, a “pilot subscription plan” was released with this premise:

We love our free users and will continue to offer free access to ChatGPT. By offering this subscription price, we will be able to help support the availability of free access to as many people as possible.

But aren’t the users the ones doing the teaching?

I was also struck by another clarification on the official page “ChatGPT: Optimizing Language Models for Dialogue”: a link leads to “aligning language models” and specifies the following:

We have trained language models that are much better at following user intentions than GPT-3, making them also more truthful and less toxic, using techniques developed through our alignment research. These InstructGPT models, which are trained with humans in the loop, are now deployed as predefined language models on our API.

Less toxic… I suppose toxicity refers to how earlier models also picked up elements that were, let’s say, not politically correct.

The difference between man and machine is just that: imperfection.

Am I wrong?

Do you think we will get to the point where we will be the ones learning from AI and not vice versa?
