Terence Tao 

Terence Tao is another remarkable figure who deserves a Nobel Prize. However, since there isn't a Nobel Prize for mathematics, the highest honor he can receive is the Fields Medal, which he has already won.

Amadeus #Mozart, go Bob Dylan!

[1] https://en.wikipedia.org/wiki/Fields_Medal

His: BLOG | OFFICIAL | AWARD

Categories:  Articles | Personal
Friday, 28 June 2024 04:56:50 (Central Europe Standard Time, UTC+01:00)  #    Comments

 

 On Bosons and Morons 

It finally happened. Pfeew!

Article link: On Bosons and Morons

Categories:  AI | Articles | Parody
Saturday, 10 February 2024 12:10:30 (Central Europe Standard Time, UTC+01:00)  #    Comments

 

 Paradigm Shift is Happening in the AI Space and Hardly Anyone Notices 

Microsoft Copilot (AI based, coming to all Office, Bing and communication products)

Currently named Copilot for Work..

Categories:  AI
Sunday, 19 March 2023 15:28:44 (Central Europe Standard Time, UTC+01:00)  #    Comments

 

 M3GAN 

M3GAN is nice and cute.

IMDB link.

Categories:  AI
Friday, 27 January 2023 17:51:52 (Central Europe Standard Time, UTC+01:00)  #    Comments

 

 Go on, then.. #AI 

Source link: medRxiv

Quite scary.. think.

Categories:  AI
Tuesday, 17 January 2023 08:04:12 (Central Europe Standard Time, UTC+01:00)  #    Comments

 

 A Little Late Night Conversation with @OpenAI 

A copy and paste from my conversation with https://chat.openai.com.

Response time was under 20 milliseconds. Handling context with no issues. Fascinating.


> So, how are you OpenAI?
I'm doing great! OpenAI is a research laboratory dedicated to finding ways to use artificial intelligence to benefit humanity. We're making advances in machine learning, robotics, and other AI technologies, and we're always looking for new opportunities to make the world a better place.

> What's a garbage collector?
A garbage collector is a program in computer science that is responsible for managing the memory allocation of a computer system in order to ensure that it runs efficiently. It frees up memory that is no longer in use and reclaims memory that is being used inefficiently.

> Should we use it?
Yes, garbage collection is important for the efficient use of memory and can help improve the performance of a computer system. If a computer system is not using garbage collection, it’s likely that it will suffer from memory leaks, which can lead to reduced performance and system instability.

> Hmm, I should have asked differently...
No worries, we're here to help. What other questions do you have about garbage collection?

> What I meant to ask was, should we use it manually or should we leave garbage collection to platforms we use?
It depends on the platform and the application you are developing. If the platform has a built-in garbage collection system, you can usually leave it to the platform to manage memory allocation and reclaim memory. However, in some cases, it might be necessary to manually implement garbage collection in order to optimize the performance of the system.

> Could you give me an example of that?
Sure. For example, if you are developing a game, it might be beneficial to manually implement garbage collection in order to free up memory that is no longer in use and reclaim memory that is being used inefficiently. This can help keep the game running smoothly and improve the overall user experience.

> Any other examples?
Yes, garbage collection can also be used in other areas such as web development and database management. Garbage collection can help reduce the amount of memory that is used by a web application and can help ensure that a database is running efficiently.

> Any other examples when we should trust the platforms for garbage collection?
In many cases, it is best to trust the platforms for garbage collection. For example, if you are developing an application that runs on a popular platform such as Android or iOS, you can usually trust the garbage collection system that is built into the platform. This will ensure that memory is being managed efficiently and that the application is running smoothly.

> What do you mean by 'managed efficiently'?
When garbage collection is managed efficiently, it means that memory is being allocated and reclaimed in a way that maximizes efficiency and minimizes memory leaks. This helps to ensure that the application runs smoothly and does not suffer from performance issues due to memory leaks.

> You are quite knowledgeable about the subject.
Thank you for your kind words!

> What happens if memory is not reclaimed?
If memory is not reclaimed, it can lead to memory leaks and decrease the performance of the application. Memory leaks can also cause the application to crash or become unstable. To prevent these issues, it is important to ensure that memory is being reclaimed efficiently with the help of garbage collection.

> So, what did programmers do when there was no automatic garbage collection?
Before the advent of automatic garbage collection, programmers had to manually manage memory allocation and reclaim memory when necessary. This could be a tedious and time consuming process, and errors in memory management could lead to memory leaks and other performance issues.

> What do they do now in C or Lisp or Fortran?
Nowadays, many programming languages such as C, Lisp, and Fortran have built-in garbage collection systems. This allows programmers to focus more on the application logic, rather than having to manually manage memory allocation and reclaim memory.

> Which platforms or languages do have automatic garbage collection?
Many popular platforms and languages such as Java, JavaScript, Python, and more have built-in garbage collection systems. Additionally, many modern operating systems such as Windows and macOS also have garbage collection systems.

> Does .NET have it?
Yes, .NET has a built-in garbage collection system. The .NET garbage collection system is responsible for managing memory allocation, reclaiming memory when appropriate, and ensuring that the application runs efficiently.

> So.. What are generations in terms of .NET garbage collection?
Generations are a concept used in .NET garbage collection to help improve the performance of the garbage collector. The .NET garbage collector divides objects into three generations based on their lifetime, and objects that have been alive for longer are placed into higher generations. This helps the garbage collector to identify which objects are the most likely to be reclaimed, allowing it to reclaim memory more efficiently.

> Tell me about latency models in this case.
Latency models are used in .NET garbage collection to help reduce the impact that garbage collection has on the performance of the application. The latency model estimates the time it will take for the garbage collector to reclaim memory, and then it schedules garbage collection at times that will have the least impact on the application. This helps to ensure that the application runs smoothly, even when garbage collection is running.

> Any thoughts on weak references for GC?
Weak references are a type of reference that can help the garbage collector reclaim memory more efficiently. Weak references are references to objects that are no longer strongly referenced, and they allow the garbage collector to reclaim the objects that they refer to even if they are not strongly referenced. This can help reduce memory leaks and improve the performance of the application.

> You might just pass a Turing test. What is it?
The Turing test is a test of a machine’s ability to exhibit intelligent behavior that is indistinguishable from that of a human. It was proposed by Alan Turing in 1950, and it is still used today to evaluate the performance of AI systems.

> Could you pass it?
My understanding is that I am not the one who is being tested in the Turing test - it is the machine that is being tested. However, it is possible that I could pass the Turing test if I was able to exhibit intelligent behavior that is indistinguishable from that of a human.

> So coming back to GC.. Could you explain the concept of compaction?
Compaction is a process used by the garbage collector to reclaim memory more efficiently. The garbage collector moves objects around in memory to free up memory that is no longer in use and to reduce the amount of memory fragmentation. This helps to reduce memory leaks and improve the performance of the application.

> No, GC as in garbage collection...
Ah, of course! Garbage collection is a process used in computer science to manage the memory allocation of a computer system in order to ensure that it runs efficiently. It frees up memory that is no longer in use and reclaims memory that is being used inefficiently.

> Giving you an option to select between a forced and optimized garbage collection. When would you choose one or the other?
The choice between forced and optimized garbage collection depends on the application and the expected workload of the system. Forced garbage collection is used when the application needs to reclaim memory immediately and the system can tolerate a slight decrease in performance. Optimized garbage collection is used when the application can wait for the garbage collector to reclaim memory and the system's performance needs to be maximized.

> Quite good.
Thank you!

> Could we change topics?
Of course! What would you like to talk about?

> How much power does a 12kW power outlet generate in 12 hours?
A 12kW power outlet will generate 144 kWh (12 kW x 12 hours) of electricity in 12 hours.

> What about 120 hours?
For 120 hours, the 12 kW power outlet will generate 1,440 kWh (12 kW x 120 hours) of electricity.

> So, a 700MW power plant generates how many times more power that this power outlet?
A 700 MW power plant will generate 4,800,000 kWh of electricity in 12 hours (700 MW x 12 hours), which is 33,333 times more power than the 12 kW power outlet.

> What are typical examples of such a powerful power plants? Do they run on coal?
Yes, many large power plants are powered by coal. These power plants are typically used to generate electricity for large cities and regions. Examples of such power plants include the Taichung Power Plant in Taiwan, which has a capacity of 5,500 MW, and the Four Corners Power Plant in the US, which has a capacity of 3,520 MW.

> What about nuclear, which one generates the most power?
The Kashiwazaki-Kariwa Nuclear Power Plant in Japan is the largest nuclear power plant in the world, with a capacity of 7,965 MW. It is followed by the Fukushima Daiichi Nuclear Power Plant in Japan (4,711 MW) and the Bruce Nuclear Generating Station in Canada (4,400 MW).

> And in Europe?
In Europe, the largest nuclear power plant is the Kola Nuclear Power Plant in Russia, with a capacity of 4,800 MW. It is followed by the Cattenom Nuclear Power Plant in France (3,960 MW) and the Olkiluoto Nuclear Power Plant in Finland (2,800 MW).

> And in Slovenia?
In Slovenia, the only nuclear power plant is the Krško Nuclear Power Plant, with a capacity of 688 MW.

> Austria?
In Austria, there are no nuclear power plants.

> Ukraine?
In Ukraine, the largest nuclear power plant is the Zaporizhia Nuclear Power Plant, with a capacity of 6,000 MW.

> You just said Kola nuclear poweplant is the largest in Europe. What gives?
The Kola Nuclear Power Plant in Russia is the largest nuclear power plant in Europe in terms of capacity, but it is not located in Europe itself. It is located in the Murmansk region of Russia, which is east of Finland and Norway.

> So, how many reactors does Zaporizhia consist of?
The Zaporizhia Nuclear Power Plant consists of six nuclear reactors.

> Each generating how many megawatts?
Each of the six reactors at the Zaporizhia Nuclear Power Plant generates 1,000 MW of electricity.

> Thank you. You've been splendid.
You're welcome! It was my pleasure to help.
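
For the .NET-minded reader, here is a tiny C# sketch of the concepts that came up above: weak references, latency modes, and the forced vs. optimized collection choice. It is an illustration only (you rarely want to call GC.Collect yourself), not a recommendation:

// Illustration only: the GC concepts from the conversation above, in .NET terms.
using System;
using System.Runtime;

class GcChat
{
    static void Main()
    {
        // A weak reference lets the GC reclaim the target even while we still hold the reference.
        var data = new byte[10000000];
        var weak = new WeakReference(data);
        data = null;                       // drop the only strong reference

        // Latency mode: trade collection pause length against memory footprint.
        GCSettings.LatencyMode = GCLatencyMode.Interactive;

        // 'Optimized' lets the runtime decide whether collecting now is worth the cost...
        GC.Collect(2, GCCollectionMode.Optimized);

        // ...while 'Forced' collects immediately, regardless of cost.
        GC.Collect(2, GCCollectionMode.Forced);
        GC.WaitForPendingFinalizers();

        Console.WriteLine($"Weak target still alive: {weak.IsAlive}");
        Console.WriteLine($"Max generation: {GC.MaxGeneration}");
    }
}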

Categories:  AI
Wednesday, 07 December 2022 01:52:40 (Central Europe Standard Time, UTC+01:00)  #    Comments

 

 Bing Daily Image - A Bing Image of the Day Download Tool 

Ever wanted to have your Windows desktop background look sharp? Like, Bing Gallery sharp? Let me help you.

Here's a little tool I'm releasing today that allows you to:

  • Get daily Bing Image of the Day to your desktop
  • Perform bulk downloads, multiple images at a time
  • Add image title and description to your wallpaper [1]
  • Run it as a service

It's all in one tool. Small, simple, efficient.

Here's the parameter model it supports:

BingDailyImage v1.0 - Download desktop wallpaper images from Bing
Copyright © 2021, Matevž Gačnik
www.request-response.com

Gets Bing Picture of the Day images for today and a specified number of days back.

Usage: BingDailyImage [-c] [-d [days]] [-f folder] [-t [top|bottom]] [-b]
   -c             Get current Bing image
   -d [days]      Specifies number of days to fetch.
                  If you omit this parameter the tool will download
                  last two weeks (14 days) of Bing wallpapers.
   -f             Set download folder
                  If you omit this parameter the folder will be
                  set to  - '%USERPROFILE%\Pictures\Bing Wallpapers'.
   -t             Add text (image title and description) to images
                  You can specify text position [top, bottom]. Default is bottom.
   -b             Set last downloaded image as desktop background
   -s install     Installs BingDailyImage as a system service
                  Use -f to specify service download folder path
                  Use -t to let service add text to images
   -s uninstall   Uninstalls BingDailyImage as a system service
   -s start       Starts BingDailyImage service
   -s stop        Stops BingDailyImage service
   -s query       Queries BingDailyImage service state
   -h             Displays help

You can just do a BingDailyImage.exe -c to get the current daily image. By default, it will not tamper with background images, so you'll get the highest resolution available (1920x1200 or 1920x1080), like this:

BingDailyImage v1.0 - Download desktop wallpaper images from Bing
Copyright © 2021, Matevž Gačnik
www.request-response.com

Downloading Bing Image of the Day for 2021-12-16.
Image date: 2021-12-16
Image title: Old Town in Salzburg, Austria
Image description: When it's lit up like this with a cozy glow, we can admire… When there's a mountain in your city… We're looking at the Old Town portion
of this Baroque city…
Downloading background... Background for 1920x1200 found.
Saving background... Done for 2021-12-16.

Or do a BingDailyImage.exe -d 10 -t to get the last 10 and add a nice, transparent background text to them.

Hell, do a BingDailyImage.exe -s install and forget about it. It's going to download new images once they are published to Bing's servers. All you need to do now is set your Windows 10/11 desktop background to be fetched from the download folder. Done.
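
For the curious, this is roughly how getting the image of the day can work under the hood. The sketch below is not the tool's source code, just a minimal C# illustration assuming the widely used, unofficial HPImageArchive JSON feed (which BingDailyImage may or may not use internally):

// Minimal sketch, not BingDailyImage's actual source: fetch today's Bing Image of the Day
// via the unofficial HPImageArchive JSON feed and save it under the default folder.
using System;
using System.IO;
using System.Net.Http;
using System.Text.Json;
using System.Threading.Tasks;

class BingImageSketch
{
    static async Task Main()
    {
        using var http = new HttpClient();

        // idx=0 means today, n=1 means one image; mkt selects the market/locale.
        var feed = "https://www.bing.com/HPImageArchive.aspx?format=js&idx=0&n=1&mkt=en-US";
        using var doc = JsonDocument.Parse(await http.GetStringAsync(feed));
        var image = doc.RootElement.GetProperty("images")[0];

        // The feed returns a relative URL; prepend the Bing host to download the image.
        var url = "https://www.bing.com" + image.GetProperty("url").GetString();
        var title = image.GetProperty("title").GetString();

        var folder = Path.Combine(
            Environment.GetFolderPath(Environment.SpecialFolder.MyPictures),
            "Bing Wallpapers");
        Directory.CreateDirectory(folder);

        var file = Path.Combine(folder, $"{DateTime.Today:yyyy-MM-dd}.jpg");
        File.WriteAllBytes(file, await http.GetByteArrayAsync(url));

        Console.WriteLine($"Saved '{title}' to {file}");
    }
}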

Here's the download.

Here's a sample of a downloaded image.



[download original image]

[1] You might be surprised about the fun facts you'll learn every day.

Categories:  Other | Personal | Windows 10
Wednesday, 01 December 2021 19:05:46 (Central Europe Standard Time, UTC+01:00)  #    Comments

 

 Clouds Will Fail 

This is a slightly less technical post, covering my experiences and thoughts on cloud computing as a viable business processing platform.

The recent Amazon EC2 failure gathered a considerable amount of press and discussion coverage. Most of the discussion revolves around the failure of cloud computing to deliver on its promise to never go down and never lose a bit of information.

That promise is wrong and has been wrong for a couple of years. Marketing people should not be making promises their technical engineers can't deliver. Actually, marketing should really step away from highly technical features and services in general. I find it funny that there is no serious marketing involved in selling BWR reactors (which fail too), yet they probably serve about the same number of people as cloud services do nowadays.

Getting back to the topic: as you may know, EC2 failed miserably a couple of weeks ago. It was something that should not happen - at least in many techie minds. The fault in the AWS EC2 cloud was with the EBS storage system, which failed across multiple availability zones within the same AWS region in Northern Virginia. Think of availability zones as isolated sets of infrastructure within the same region, and of regions as separate, geographically distant datacenters.

Companies like Twitter, Reddit, Foursquare, Tekpub, Quora and others all deployed their solutions to the same Amazon region - Northern Virginia - and were thus susceptible to problems within that specific region. They could have replicated across different AWS regions, but did not.

Thus, clouds will fail. It's only a matter of time. They will go down. The main thing clouds deliver is a lower probability of failure, not its elimination. Thinking that cloud computing will solve the industry's fears of losing data, or deliver 100% uptime, is downright imaginary.

Take a look at EC2's SLA: it promises 99.95% availability. Microsoft's Azure SLA? 99.9%. That's roughly four to nine hours of allowed downtime per year built in (0.05% of a year is about 4.4 hours, 0.1% about 8.8 hours)! And we haven't even started to discuss how much junk marketing people will sell.

We are still in an IaaS world, although companies are really pushing PaaS and SaaS hard. Having said that, Windows Azure's goal of 'forget about it, we will save you anyway' currently has a lot more merit than other offerings. It is indeed trying to go the PaaS and SaaS route while abstracting away the physical machines, racks and local/private datacenters.

Categories:  Architecture | Other
Saturday, 04 May 2019 21:02:27 (Central Europe Standard Time, UTC+01:00)  #    Comments

 

 P != NP Proof Failing 

One of the most important steps needed for computer science to get itself to the next level seems to be fading away.

P vs NP

Actually, its proof is playing hard to catch again. This question (whether P = NP or P != NP) does not want to be answered. It could be that the problem of proving it is also NP-complete.

The (scientific) community wants, and needs, closure. If P != NP were proven, a lot of orthodox legislation around PKI, cryptography and signature/timestamp validity would probably become looser. If P = NP is true, well, s*!t hits the fan.

Categories:  Other
Sunday, 19 August 2018 21:31:28 (Central Europe Standard Time, UTC+01:00)  #    Comments

 

 Debugging Windows Azure DevFabric HTTP 500 Errors 

While developing with Windows Azure SDK and local Azure development fabric, when things go wrong, they go really wrong.

It could be something as obscure as leaving an empty element somewhere in Web.config, or a certificate issue.

The problem is that before Visual Studio attaches a debugger to an IIS worker process, a lot of things can go wrong. What you get is this:

There's no Event Log entry, nothing in the typical dev fabric temp folder (always check 'C:\Users\\AppData\Local\Temp\Visual Studio Web Debugger.log' first), nada.

Poking deeper, what you need to do is allow IIS to respond properly. By default IIS only displays complete error details for local addresses. So, to get a detailed report you need to use a local address.

You can get a local address by fiddling with the site binding in IIS Manager and changing what the Azure SDK sets up:

  • First, start IIS Management Console.
  • Then right click on your deployment(*).* and select Edit Bindings.
  • Change the binding's IP address to All Unassigned

If you hit your web/service role using the new local address (something like http://127.0.0.1:82), you will most likely get the full error disclosure, like this:

In this case, a service with an empty element somewhere in its Web.config was deployed to the dev fabric. The web role was failing before Visual Studio could attach a debugger, so only an HTTP 500 was returned through the normal means of communication.

Categories:  .NET 4.0 - WCF | Windows Azure
Monday, 10 April 2017 15:04:49 (Central Europe Standard Time, UTC+01:00)  #    Comments

 

 The Case of Lightweight Azure MMC Snap-In Not Installing on Azure SDK 1.6 

There are a couple of Windows Azure management tools, scripts and PowerShell cmdlets available, but I find the Windows Azure Platform Management Tool (MMC snap-in) one of the easiest to install and use across different Windows Azure subscriptions.

The problem is that the tool has not been updated for almost a year and is thus failing when you try to install it against the latest Windows Azure SDK (currently v1.6).

Here's the solution.

Categories:  Windows Azure
Saturday, 26 March 2016 11:36:53 (Central Europe Standard Time, UTC+01:00)  #    Comments

 

 The Case of Guest OS Versioning in Windows Azure 

There's a notion of Windows guest OS versions in Windows Azure. A guest OS can currently (Q1 2012) be either a stripped-down version of Windows Server 2008 or a similar version of Windows Server 2008 R2.

You can upgrade your guest OS in Windows Azure Management Portal:

Not that it makes much difference, especially while developing .NET solutions, but I like to be on the newest OS version all the time.

The problem is that the defaults are stale. In version 1.6 of the Windows Azure SDK, the default templates all specify the following:

   xmlns="http://schemas.microsoft.com/ServiceHosting/2008/10/ServiceConfiguration"
   osFamily="1"
   osVersion="*">
     

The osFamily attribute defines the OS version, with 1 being Windows Server 2008 and 2 being Windows Server 2008 R2. If you omit the osFamily attribute, the default is 1 as well! Actually, this attribute should probably live on the Role element, since it defines the version of the role's guest OS.

   xmlns="http://schemas.microsoft.com/ServiceHosting/2008/10/ServiceConfiguration"
  
osFamily="1"
   osVersion="*">
 
   
    

 
 
   
    
 

It doesn't make sense to have it normalized across all roles. Also, this schema makes it impossible to leave it out for VM role instances, where it gets ignored.

The osVersion attribute defines the guest OS version that should be deployed. The format is either * or WA-GUEST-OS-M.m_YYYYMM-nn; you should never use the latter. The asterisk means 'please upgrade all my instances automatically'. The asterisk is your friend.

If you want or need Windows Server 2008 R2, change osFamily to 2 in your service configuration XML.

What this means is that even if you publish and then upgrade your guest OS version in the Azure Management Portal, it will get reverted the next time you update your app from within Visual Studio.

Categories:  Windows Azure
Sunday, 31 March 2013 21:40:01 (Central Europe Standard Time, UTC+01:00)  #    Comments

 

 Bleeding Edge: Pub/Sub Broker Design 

This is the content from my Bleeding Edge talk on pub/sub broker design and implementation.

Due to constraints of the project (funded by the European Commission), I cannot publicly distribute the implementation code at this time. I plan to do so after the review process is done, although I have been advised that this probably won't be possible.

Specifically, this is:

  • A message based pub/sub broker
  • Can use typed messages
  • Can be extended
  • Can communicate with anyone
  • Supports push and pull models
  • Can call you back
  • Service based
  • Fast (in memory)
  • Is currently fed by the Twitter gardenhose stream, for your pleasure

Anyway, I can discuss the implementation and design decisions, so here's the PPT (in Slovene only).
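
Since the project code itself is off limits, here is a minimal, hypothetical C# sketch of the core idea only: an in-memory broker with typed messages and a push (callback) delivery model. The real implementation is service based, supports pull as well, and does a lot more; the class names below are mine, not the project's:

// Hypothetical illustration of a typed, in-memory pub/sub broker (push model).
// The real broker is service based, supports pull as well, and is fed by a Twitter stream.
using System;
using System.Collections.Concurrent;
using System.Collections.Generic;

class Broker
{
    // One subscriber list per message type.
    private readonly ConcurrentDictionary<Type, List<Delegate>> _subscribers =
        new ConcurrentDictionary<Type, List<Delegate>>();

    public void Subscribe<TMessage>(Action<TMessage> handler)
    {
        var list = _subscribers.GetOrAdd(typeof(TMessage), _ => new List<Delegate>());
        lock (list) list.Add(handler);
    }

    public void Publish<TMessage>(TMessage message)
    {
        if (!_subscribers.TryGetValue(typeof(TMessage), out var list)) return;
        Delegate[] snapshot;
        lock (list) snapshot = list.ToArray();
        foreach (Action<TMessage> handler in snapshot)
            handler(message);   // push: the broker calls the subscriber back
    }
}

class Tweet { public string User; public string Text; }

class Demo
{
    static void Main()
    {
        var broker = new Broker();
        broker.Subscribe<Tweet>(t => Console.WriteLine($"{t.User}: {t.Text}"));
        broker.Publish(new Tweet { User = "gardenhose", Text = "Hello, pub/sub!" });
    }
}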

Downloads: Bleeding Edge 2011, PPTs
Demo front-end: Here

Wednesday, 05 October 2011 10:17:31 (Central Europe Standard Time, UTC+01:00)  #    Comments

 

 The Case of Empty OptionalFeatures.exe Dialog 

The following is a three-day saga of an empty 'Turn Windows features on or off' dialog.

This dialog, as unimportant as it may seem, is the only orifice into Windows subsystem installations that doesn't require command-line msiexec.exe wizardry on obscure system installation folders that nobody wants to understand.

Empty, it looks like this:

First thing anyone should do when it comes to something obscure like this is:

  1. Reinstall the OS (kidding, but would help)
  2. In-place upgrade of the OS (kidding, but would help faster)
  3. Clean reboot (really, but most probably won't help)
  4. Run chkdsk /f and sfc /scannow (really)
  5. If that does not help, proceed below

If you still can't control your MSMQ or IIS installation, then you need to find out which of the servicing packages got corrupted somehow.

Servicing packages are Windows Update packages, located in hell under HKLM\SOFTWARE\Microsoft\Windows\CurrentVersion\Component Based Servicing\Packages. I've got a couple thousand under there, so the only question is how to dig the rogue one out.

There's a tool called the System Update Readiness Tool [here] that nobody uses. Its side effect is that it checks for peculiarities like this. Run it, then unleash notepad.exe on C:\Windows\Logs\CBS\CheckSUR.log and look for something like this:

Checking Windows Servicing Packages

Checking Package Manifests and Catalogs
(f) CBS MUM Corrupt 0x800F0900 servicing\Packages\
Package_4_for_KB2446710~31bf3856ad364e35~amd64~~6.1.1.3.mum  Line 1:

(f) CBS Catalog Corrupt 0x800B0100 servicing\Packages\
Package_4_for_KB2446710~31bf3856ad364e35~amd64~~6.1.1.3.cat  

Then find the package in the registry, take ownership of its node, set permissions so you can delete it, and delete it. Your OptionalFeatures.exe works again, and it took only 10 minutes.
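
If you don't feel like scrolling through a couple thousand registry keys by hand, here is a small, hypothetical C# helper (not part of the workflow above, just an illustration) that lists the servicing package keys matching a KB number from CheckSUR.log. Taking ownership and deleting the node is still a manual job:

// Hypothetical helper: list servicing package registry keys matching a KB number.
// Run elevated; it only reads - ownership changes and deletion are left to regedit.
using System;
using Microsoft.Win32;

class FindServicingPackage
{
    static void Main(string[] args)
    {
        // e.g. "KB2446710" - the KB number reported in CheckSUR.log
        string needle = args.Length > 0 ? args[0] : "KB2446710";

        using (var hklm = RegistryKey.OpenBaseKey(RegistryHive.LocalMachine, RegistryView.Registry64))
        using (var packages = hklm.OpenSubKey(
            @"SOFTWARE\Microsoft\Windows\CurrentVersion\Component Based Servicing\Packages"))
        {
            if (packages == null)
            {
                Console.WriteLine("Packages key not found or access denied.");
                return;
            }

            foreach (var name in packages.GetSubKeyNames())
            {
                if (name.IndexOf(needle, StringComparison.OrdinalIgnoreCase) >= 0)
                    Console.WriteLine(name);
            }
        }
    }
}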

Categories:  Other | Windows 7 | Work
Tuesday, 07 June 2011 07:57:17 (Central Europe Standard Time, UTC+01:00)  #    Comments

 

Copyright © 2003-2024, Matevž Gačnik
The opinions expressed herein are my own personal opinions and do not represent my company's view in any way.

My views often change.

This blog is just a collection of bytes.
