Friday, October 9, 2020

How to write your own Server-Side Extension!

Server-Side Extension (IIS or Apache)!

For many years I have run my own web server. In my case, this has always been a Windows Server, because I want to write my server-side scripts in Delphi! Apart from that, I had no clue about Unix/Linux servers!

Being part of the Apocalypse Coding Group opened this door, and over many hours of development on our little game, I could prove that the same ISAPI DLL can also be compiled and executed on an Ubuntu Apache server!

I love Delphi. 

Building a new website or web service in the past, I had to make a choice:

For the website:

1. ASP.NET (too bad it was only supported up to D2007)

After using ISAPI DLLs for a long time, I more often used ASP.NET. The code-behind of a web page is not locked by the server, so it was easy to update the DLL files. Updating ISAPI DLLs was also possible by using the well-known "egg"-loader. But updating the server to a newer version suddenly prevented the loader from working, and I never had the time to find the reason. That's why I preferred ASP.NET, and for the other ISAPI DLLs I always had to start and stop my IIS server.

For the web service:


SOAP was the way to go because it was so easy to use. Just define your interface and use it as if it were local in your application. The only drawback is the huge amount of data that is transferred with every call. That's why I used it not in the "native" way, but just to transport an encrypted, compressed binary stream.

This was the past... But as time goes by...

If your traffic is increasing and you want to do many more things with your server, there comes a time when you realize that a 2 core server with an ancient operating system is just not up to date anymore.

Virtualization is one way to go. In my case, I use Proxmox on a beefy server with 2 CPUs, many cores, plenty of RAM, and a RAID SSD drive... Nice...

This server handles two or more Windows 2019 Servers and many Ubuntu servers for small web services or microservices.

Of course, for web services "we" are doing JSON/RESTful web services these days. So what is needed to do all the fancy stuff?

ISAPI DLLs - best performance: native execution, no script interpretation. For a website there are two ways to go (perhaps many more, but not for me):

1. Use a "normal" HTML page and AJAX background calls to get the dynamic content you need for your site.

2. Use the WebBroker implementation and deliver the content directly from your DLL. This content could in turn make REST calls to get more data from the same or a different service.
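As a sketch of the second option, a minimal WebBroker module could look like this. The unit and handler names are my own illustrative choices, not a prescribed layout:

```pascal
unit WebModuleMain;

interface

uses
  System.SysUtils, System.Classes, Web.HTTPApp;

type
  TWebModule1 = class(TWebModule)
  public
    // Hooked up as the default action handler of the module
    procedure DefaultHandlerAction(Sender: TObject; Request: TWebRequest;
      Response: TWebResponse; var Handled: Boolean);
  end;

implementation

procedure TWebModule1.DefaultHandlerAction(Sender: TObject;
  Request: TWebRequest; Response: TWebResponse; var Handled: Boolean);
begin
  // Deliver dynamic content directly from the compiled DLL -
  // no script interpretation involved.
  Response.ContentType := 'application/json';
  Response.Content := '{"status":"ok"}';
  Handled := True;
end;

end.
```

The same source compiles as an ISAPI DLL for IIS or as an Apache module, which is exactly why the DLL could run on the Ubuntu server as well.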

Perhaps you also want URL rewriting or some other server-side handling; in this case you also need a filter DLL.

This time I'll do it better...

With the newly installed Windows 2019 Server, I tried the simple Hello-World DLL that is included with the egg-loader. But the server was unable to execute this little DLL; a request to the URL just downloaded it instead. It was the same with other DLLs I copied over from the old server (the Hello-World.DLL works without problems on the old server). After some googling, I found the "use 32-bit DLLs" setting. But with this setting, I got a 500 with a weird error message. Setting the error level to detailed brought more information. Spoiler: do not google these errors! Everything you'll find is nonsense, like: you have to disable compression, or other things. Most of the ideas lead you to the handler settings - also wrong. (Of course, you have to install all the ISAPI IIS extensions; I assume you have done this.) So what is the difference between the old and the new server?


OMG... Why is there a setting for 32-bit DLLs if you have to use 64-bit anyway? OK - start 10.4.x, compile to 64-bit and try again. Too bad the loader version 2.0 is from 2005 and not Unicode. After some fixes - the same problems. I had forgotten to disable the 32-bit setting. The Hello-World.DLL brings no errors anymore, but is downloaded again with every call. The loader is "kind of" working. All log files are written as wide strings - unreadable. Again some Unicode fixes, and I recompiled the Hello-World.dll to 64-bit...

And here we go... a nice "Hello" displayed in the browser. But the egg-loader is unable to replace the *.run version with the *.update version. I thought IIS was still holding the DLL (how is this possible?). So I wrote an e-mail to William Egge asking if he had any ideas...

This error seemed impossible. The loader is working, the Hello-World is working, FreeLibrary should also work, and IIS should have no knowledge of the DLL loaded by the loader. Why is the loader unable to rename the files?

OMG... Again... No rights! 

By giving the IIS user access rights, everything works perfectly! Of course, I wrote to William that everything is working now... Answer from Bill: "Ok Cool 😊"! Perhaps I will send him the changes, or I will write a different version that is also able to handle filter DLLs. Not decided yet!

So here is the workflow:

1. Install the Windows Server and all necessary extensions.
2. Create a website!
3. Compile your WebBroker.DLL (64-bit, or matching the server).
4. Create a directory outside of your web-page path (e.g. c:\API, c:\ISAPI or c:\Script).
5. Create a minimal web.config in this directory (for debugging: errorMode="Detailed").

<?xml version="1.0" encoding="UTF-8"?>
<configuration>
  <system.webServer>
    <handlers accessPolicy="Execute, Script" />
    <directoryBrowse enabled="false" />
    <httpErrors errorMode="Detailed" />
  </system.webServer>
</configuration>

6. Upload your DLL to this directory and rename it to WebBroker.run (or any other base name; the loader swaps the *.run and *.update files).
7. Upload the loader.dll to the directory and rename it to WebBroker.dll (the same base name as your DLL).
8. Give the IIS user access rights to this directory.
9. Add an application to your website in the IIS Manager and point it to the ISAPI directory.

(The application name will be part of the URL, e.g. "API".)

10. Reading is not permitted and directory browsing is off, but I would store log files outside of this directory anyway.

With the redesign of the egg-loader, we are back in the game!

If you want to see a YouTube tutorial on this, please leave a comment. There will be a video about this as part of my Firemonkey Development Kit and the server side of my JSON-Store implementation, but it is not scheduled before I've completed my #D.MVVM Framework series and more basic FDK material.

If you want a notification on new videos, please subscribe to my channel and hit the bell icon! Thank you, this helps to grow my channel!

Tuesday, September 29, 2020

Workflow and multi-binding with #D.MVVM - The Delphi MVVM Framework!

Workflow and multi-binding with #D.MVVM!

In every example for MVVM, you find the three blocks diagram explaining the binding between the View, the ViewModel, and the Model.

If you dig deeper into a real-world application, you may use the 1:1:1 relation, but in most cases you need something very different.

1:1:1 means One View connected to one ViewModel, connected to one Model!

Let's start with the View:

Most of my views are not simple TForms but are assembled from small subframes. The subframes are also TForms and not TFrames, because of the recurring problems with frames (in the past). But that is not the topic.

The View will be created in the composition root - or elsewhere - and then?

Every subframe must also be created and placed into the target frame of the parent view.

So far so good. Of course, this is all handled by my framework.

In some cases, every frame should have its own ViewModel. But what if not?

Well, keep this in mind and start at the other end of this chain...

We have a database where every row is just an entry of our beloved TPerson, with the normal address fields.

Part of this TPerson is a list of telephone numbers handled in a different table and perhaps also a list of bank accounts also in a different table.

As we all know - I hope - the Model is not our database interface; the Model gets an interface to the database for local or remote storage. This could be an interface to a local database, a RESTful web service, or something else!

Let us talk concretely for the moment:

To store a value in a database or send it as JSON, you need the field value. For the complete row, and perhaps for the subtables, you need a list of field values and the field names, or at least the index positions in the param list of the query.

We will name this: Place X.

Place X needs the Name, Street, Zip, and more... And of course, the two lists for the subtables.

To be able to get this right you have to collect the data. A good place would be the Model. Wait... The Model?

So for all the different fields and from at least 3 frames, we have to collect the data in only one Model!

To store the data, the database should use a transaction to keep the subtables in sync with the address table. It would be a nightmare if 3 Models did this separately.

At this point, we have our View with 3 subframes (Address, Telefonlist, Bankaccountlist) and only one Model.

1(3) : ? : 1

The View is not the place to collect the data from the subframes. Should we bind the subframes to one ViewModel?

In this case, we get our example relation of 1:1:1. That would be nice, but will it fit our needs? One reason to separate our View into SubViews is the possibility to swap a subview for a different version. To check whether the subframe data is sufficient to be stored, I would like to have a simple ViewModel per frame. Let's collect:

PersonView -> PersonViewModel -> PersonModel <- Database/Rest Interface

AddressView -> AddressViewModel -> ?
TelefonView -> TelefonViewModel -> ?
BankView -> BankViewModel -> ?

The PersonViewModel has no fields (or just two buttons).

Even if we do not store data in visual controls, we must define the controls. Let's name these Place B, C, and D for the three subviews, and Place A for the PersonView that only has a clear and a save button.

In the PersonViewModel we have to declare a field/property CanSave (Place A1). Inside the PersonViewModel we must be able to check the three other ViewModels, so every ViewModel of the subviews should be connected to the "parent" ViewModel.

In the AddressViewModel we have to declare the Field/Properties for the Address. (Place B1) And the same for the two other ViewModels (Place C1, Place D1).

In a best-case scenario, we would skip the Models for the subviews - if not, we would also have Places B2, C2, and D2.

Is MVVM slower in development because you have to type everything three times?

Normally: yes, it is!

But don't forget, we're doing MVVM to get things developed faster!

How can we achieve this?

One major point of MVVM is testability. Perhaps we are slower at first, but with a huge project and a lot of Models - if we have really good test coverage - we are much faster than other projects after a short period of time, especially if you're doing TDD.

If you haven't seen my first two videos on YouTube, please take a look. And also please subscribe to help me grow my channel!

My idea is auto-binding by name, which saves you a lot of time. The other idea is to store the field values on their flow from the component to the final database.

Let's take a look at the full development cycle and for simplicity name it in Delphi terms:

Create a TForm for the subframes, name it TPerson.View
Create a TForm for the address, name it TPersonAddress.View
Create a TForm for the telephone numbers, name it TPersonTelephone.View
Create a TForm for the bank accounts, name it TPersonBank.View

In the composition root: register the Views and implement the creation of the Forms.

Create the 4 ViewModels.

As the ViewModels are "just" classes derived from TViewModel, we only need to implement a property for each component on the TForms. This is the place where we would like to store our values. To speed things up, we don't create a local field as

FName : String;

we create this local field as

FName : TProperty<String>;

Auto-binding will do the rest and connect the Name: TEdit on the Form to this "property"!
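To give an idea of what such an observable wrapper does, here is a deliberately simplified sketch - the real TProperty<T> in the framework does more (binding registration, converters, and so on):

```pascal
uses
  System.SysUtils;

type
  // Simplified sketch of an observable property wrapper -
  // not the actual #D.MVVM implementation.
  TProperty<T> = class
  private
    FValue: T;
    FOnChanged: TProc<T>;
    procedure SetValue(const AValue: T);
  public
    property Value: T read FValue write SetValue;
    // The binder subscribes here and pushes changes to the control.
    property OnChanged: TProc<T> read FOnChanged write FOnChanged;
  end;

procedure TProperty<T>.SetValue(const AValue: T);
begin
  FValue := AValue;
  if Assigned(FOnChanged) then
    FOnChanged(FValue);  // notify the bound control / observer
end;
```

Because the field carries its own change notification, the View never needs to poll the ViewModel.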

Create an empty TPerson.Model as TPersonModel = Class(TModel).

At this point - if you're doing TDD - I would create a TPerson.Model.Test.pas and also a TPerson.ViewModel.Test.pas. But this is a topic for a different blog post.

The base TViewModel class has a procedure called ViewModelSet that is invoked when the ViewModel is set on the View.

This event is fired when you set the property in the composition root, like:

Result := TPersonView.Create;
LViewModel := TPersonViewModel.Create;
LViewModel.Model := TPersonModel.Create(TPersonDB.Create('local'));
Result.ViewModel := LViewModel; // here

By setting the ViewModel property on the View, the framework automatically loads the dependent subframes. So when everything is ready to go, the ViewModelSet method is called to set up the multi-binding.
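In code, this hook could be used roughly like this - the field names and the Parent property are illustrative assumptions, not the actual framework API:

```pascal
procedure TPersonViewModel.ViewModelSet;
begin
  inherited;
  // Everything is created and wired up at this point, so the
  // sub-ViewModels can be connected to their "parent" ViewModel.
  FAddressVM.Parent := Self;
  FTelefonVM.Parent := Self;
  FBankVM.Parent := Self;
end;
```

With this connection in place, CanSave in the PersonViewModel can simply ask the three children whether their data is sufficient.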

So we have:
1(3) : 1(3) : 1 chain!

If our MainView / MainViewModel wants to create our PersonView, it just calls "Show('PersonView')" as part of the Navigation Service - bang - everything is created and wired up for the View and the subviews.

Now - how to transfer the Data from the View to the ViewModel to the Model to the DB and even from SubViewModels? Sounds complicated!

Let's compare RAD vs. MVVM!

In a "normal" RAD App you will have to define some kind of data structure to keep your data in memory. This could be a Class or a Record with all your field-values. In our case, there will also be some kind of TList<TBank> or TArray<TTelephone>. To get your data into your data structure you can use the live bindings, to a dataset or in the old days perhaps a DBEdit to do this the direct way. Really horrible, because while viewing the data, the database connection must be kept open or copied to a mem-table. Perhaps you have done it the old way by manual setting the values to the component and back by calling a method like:

FormToData and DataToForm, where you have to assign every field to its component:

EditName.Text := Data.Name;
EditStreet.Text := Data.Street;

// ---

Data.Name := EditName.Text;
Data.Street := EditStreet.Text;

Even if you use live bindings, this can get messy if you have many controls on a form, and of course it is horrible to maintain, because all changes end up in the DFM/FMX file, which is not so easy to merge in your repository.

Designing the Form is the same for RAD and MVVM.
Instead of creating a class to hold your data, in MVVM you create a ViewModel. (Perhaps some tutorials say that the data is stored in the Model, but this is not my approach.) Creating the ViewModel should take the same time as creating your memory class. The auto-binding will be faster than every other RAD approach.

1:0 for #D.MVVM.

Autoloading of dependent frames would be 2:0 for #D.MVVM, but you could use real frames, so this doesn't count. But setting up subframes in the composition root is easier to maintain, so let's call it even and not count this. We are still at 1:0.

Using the ViewModel in a unit test to check whether the Form works as expected: 2:0 for #D.MVVM. This should actually count as 3:0, because unit testing a RAD form is not really an option.

So for the design part, we are faster with #D.MVVM than with the pure RAD approach. 

And this is where the problems begin: we don't want to redefine all the fields in the Model, doubling our effort and losing our time advantage. We also want to save ourselves the typing and don't want to duplicate source code that we then have to maintain in parallel. But in some cases we have to convert fields to be able to write them correctly into the database. The same goes for the database itself: we don't want to create a separate definition for our database by copying the fields from the ViewModel.

Perhaps you use a tool to design your database, or some kind of ModelMaker stuff to create your DB and classes! From my point of view, this is not the way to go. I want to have everything in the source to keep track of it in the source repository.

How can we achieve this?

Well, one idea is an on-the-fly created data context that is able to get or set the property fields in the ViewModel and also convert the data fields while passing them from the ViewModel to the Model and on to our database interface. Once we finally have a wizard in place for creating ViewModels and Models inside the IDE, it will be a piece of cake. Without a wizard, perhaps we have to rebuild the structure one more time to create a universal data structure that can deal with our ViewModel, our Model, and our CRUD database.

Without this approach, let's collect the places where we have to copy all the fields: a copy in the Model (A2, B2, C2, D2), in our database (at least in the SQL statement that creates the DB and in every query that reads or fills the values) (A3, B3, C3, D3), and, as we remember, Place X for the JSON.

That sounds like overkill and throws a bad light on MVVM. Four places to copy all fields... Horrible!

That's why I do it differently with my #D.MVVM framework: create your ViewModel, use the ViewModel definition and some attributes to define an ORM-like Model for the database and for the Model, and define some converter methods to handle changed fields or field types. That's it!
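The idea could look roughly like this - the attribute names and parameters are illustrative sketches, not the released API:

```pascal
type
  TAddressViewModel = class(TViewModel)
  private
    // One definition drives the binding, the Model,
    // and the ORM-like database mapping.
    [Field('FIRSTNAME', 26)]
    FFirstName: TProperty<string>;
    [Field('ZIP', 6)]
    [Converter('ZipUpperCase')]  // converter method referenced by name
    FZip: TProperty<string>;
  end;
```

The framework can then read these attributes via RTTI to build the table definition, the query parameters, and Place X for the JSON - all from the single ViewModel declaration.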

No need to define any fields in the Model: 2:0 for #D.MVVM.
Just define converter if needed: 3:0 for #D.MVVM.
Create the database with just one definition: 4:0 for #D.MVVM.

And don't forget, this is VCL and FMX. The next step is a View-to-ViewModel wizard to convert your legacy VCL app to VCL-MVVM or FMX.

This can all be done with the pure #D.MVVM framework. Combined with my FDK - the Firemonkey Development Kit, which contains many units and functions that are also usable with VCL - it provides the intuitive MVVM developer with a tool that can not only create even the most complicated application with numerous interfaces in a very short time, but also keep it maintainable for many years.

So stay tuned for the next #D.MVVM video on my YouTube channel!

Tuesday, September 15, 2020

Submitting new Apps to the stores!

I do not often need to update my main App in the App Store or Play Store.

But every time I have to, it's a pain in the a**.

Don't get me wrong: uploading a new version of my app to Google is done in 2 minutes, and the app is online almost instantly.

My daily driver is an iPhone and I also have some iPads, but the store, with all the provisioning and certificates, is a pain.

But even if you've managed to get all your certificates right, there is nearly every time a new key or entry for the *.info.plist. Most of these are well handled by the IDE. 

Then you upload your binary, and nearly every time, because of all the changes by Apple, you get a weird error. After spending hours googling, perhaps you find the reason.

This time it was the multitasking orientations for the iPad.

All the keys like UISupportedInterfaceOrientations~ipad with:
  • UIInterfaceOrientationPortrait 
  • UIInterfaceOrientationLandscapeLeft
  • UIInterfaceOrientationLandscapeRight

The error complained about the missing keys listed above; adding them fixed the problem.

Then I saw that I had not included the 1024x1024 px icon (because this is new).
After selecting the right icon, you have to increase the version number.

Submit and wait for the review. 

My app has been in the store since 2013 - this time they wanted a video from a physical device showing the use of the location sensor. (OK, this was an old function that is no longer in use, so I disabled it.)

And I tested my app against iOS 14, to be sure that everything still works...

The app was rejected because I mentioned iOS 14 in the "What's New" section.

So again a new upload and a new version number! I really like my iPhone, but from a developer's point of view, Android is so much easier to handle.

The new version is waiting for the review, we will see...

Monday, August 17, 2020

#D.MVVM - At what point is a framework ready for release?

Besides some bug-fixes and the typical daily stuff, my focus is still on my Delphi-MVVM-Framework. 

Why? Because I want to port all my old applications to it, and I promised myself not to start a new project until the framework is ready, so that I don't get the idea of either using an older version of it or not using MVVM at all.

There is also a dream that I will be able to port our 35-year-old main VCL application from D2007 directly to 10.x with #D.MVVM - at best, directly to FMX.

So I have to make sure that I can call the framework finished, which means I also have to set the needed features to the status "finished".

Of course, we all know: with such a project, as with any other application, there is never a status of "finished". So how do I find out when I'm just "tweaking" it? OK, first of all I need a feature list. I would really like to have a unit test for each feature. Second, I need sufficient documentation. (Oh boy, this is a breaking rule - I have no documentation at all, so far.)

But what about clean code - or at least cleaning up the code (a little bit)? Yes, this is 100% necessary.

How do you find out if an entry in the feature list is a really necessary feature and not just some nice functionality I want to add? I would really like to read your comments about this. What are the core functions that you think must be included in version 1.0 of my framework?

Before we try to collect some features we should define some rules.

To implement an application with the MVVM-Pattern we want:

  1. No code, or nearly no code, in our View! Except for UI stuff!
  2. Comparing a RAD app with an MVVM app, the MVVM app should be just as easy to create.
  3. The bindings are the biggest problem, they must be as easy as possible to set up.
  4. No use of anti-patterns. (Some people call a ViewFactory an anti-pattern already.)
  5. At best: just create both kinds of "Forms" and you can compile to VCL or FMX from the same codebase.
  6. Every part of your application should be unit-testable, so no dependencies between your modules. 
  7. Don't break the layer concept.
  8. The framework should work without the need of the FDK. (I had to duplicate some stuff)
  9. D.MVVM integrates as a plugin to the FDK to benefit from all the higher-level functions.
  10. Don't use the visual live bindings.
  11. Everything should work, without clicking non-visual components or special components on a form.
  12. Naming conventions over configuration.
  13. Don't care if a C# developer thinks you are doing it wrong! 😄

Anything else?

If you know me, you know that I really dislike clicking things "together". I like to do everything in code - except the form, of course - so that every change is visible in the source-code repository. Perhaps I will create some components in the end, to fit the needs of RAD development, but that is not decided yet.

While creating a feature list, I can also create some tickets to keep track of the development process.

My D.MVVM Framework should contain:

  1. A Wizard to create a new D.MVVM Application.
  2. A Wizard to create a new View (Form or Frame).
  3. A Wizard to create a new ViewModel.
  4. A Wizard to create a new Model.
  5. A Framework (VCL & FMX) independent overload of the TForm implementation to give every Form some base functionality. (done)
  6. A central point to do all the joins or Viewchains. (done)
  7. All included VCL and FMX components should be bindable. 
  8. 3rd party components should be easy to include in the binding schema.
  9. Bindings from the View to the ViewModel should be possible without writing any line of code. (done)
  10. Bindings from the ViewModel to the Model should also be possible without any code. A value converter should be included in the data transfer event. (done)
  11. A data context should keep the data in memory to be displayed. (done, but not fully tested)
  12. Some basic services should be included. (Action-Service, Navigation-Service, Menu-Service, Dialog-Service, Time-Service) 90% done.
  13. A List-Adapter for handling huge Data in Grids, Listboxes, and Memos. (done)
  14. Public Unit-Test with 100% code coverage. (Some are done, but I have not measured the coverage)
  15. VCL and FMX demo Apps. (some are done)
  16. Documentation.

I think this is not the final version 1.0 list yet, but almost.

Again: After watching Uncle Bob's clean code lessons - I have to do some heavy refactoring...

If you have any suggestions for me, I would be very happy about your comment on this topic!

If you're reading my blog for the first time, please take a look at my YouTube channel for my already released D.MVVM videos.

Please don't forget to subscribe and hit the like button to help me grow my Channel.

"New" Video about SQLite in Threads is online!

Ok, of course, this Video is not really new...

I just could not find the time to do the translation and record all the translated voice-over.

This is the second video of my German CodeRage 2020 session.

In German, the same sentence sometimes takes twice as long to say as in English. That's why the videos have some "silent" parts.

This is the second time I've used the Speechelo AI engine to create the English voice-over! It's not so bad, and since I haven't gotten any negative feedback about videos produced with this engine, it looks like you agree.

But I think next time I'll do it myself again.

Producing an English video and then translating it to German is much easier than the other way around.

So have fun.

Tuesday, August 4, 2020

Unicode migration is harder than everybody told you - if you are old.

Harder? Why harder? Everybody tells you it's easier than you think.

With every new Delphi Version there is this kind of movement:

Do your Unicode conversion today and use our new RAD Studio version to do your work even better than before. (Often combined with a special offer.)

I fully understand this and, to be honest, it's true and it's also a good idea. Using old Delphi versions is horrible. Yes, the old versions are a little bit faster and perhaps the IDE is a little bit more stable. But this is a bad tradeoff, because you miss all the new stuff that makes your life so much easier.

Ok, if everything is so fine and easy, why this blog post?

If you have old source code in your legacy application and the oldest unit was created after the year 20xx, you probably have no problems, because you've used the RAD approach and clicked DB components onto your form, bound directly to your database. Perhaps you used the BDE, but converting this to FireDAC is not so hard.

So far so good... Buy the latest Delphi version and go ahead...

But if you're old and really learned how to code in the old days with Turbo Pascal, you are not using DB components at all.

You are using a record to hold your data. The records had to be byte-aligned or marked as a packed record.

You store a date in 3 bytes, and of course you had to use typed strings to match your needs.

In those days you had this kind of declaration, because every byte counted:

Type
   Str6  = String[ 6];
   Str26 = String[26];
   Str40 = String[40];
   Str80 = String[80];

   TAddress = Packed Record
     FirstName : Str26;
     LastName  : Str26;
     Street    : Str80;
     Zip       : Str6;
     Town      : Str80;
   End;

Var
  Address : TAddress;

With this definition, you just wrote an address to disk using BlockWrite.
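For example, with an untyped file the whole packed record goes to disk as raw bytes (the file name is just an illustration):

```pascal
var
  F : File;   // untyped file
begin
  AssignFile(F, 'address.dat');
  Rewrite(F, SizeOf(TAddress));  // record size as the block size
  BlockWrite(F, Address, 1);     // one record, written byte for byte
  CloseFile(F);
end;
```

This only works because the packed record with short strings has a fixed binary layout - the very property that Unicode strings destroy.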

This kind of code works from TP 1 up to Sydney. So what is the problem?

Changing these (short) strings to a Unicode string or even to an AnsiString, you are unable to write them to disk this way anymore, because a long string is only a reference, not the actual data. No problem, you just have to serialize the data for reading and writing to a stream. Each dataset in this stream then has a different length, and you have to create a jump table to find each record's starting position. This is the best point to stop storing your data in a flat file.
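A length-prefixed format is one way to serialize such strings to a stream. This is just a sketch; the UTF-8 encoding and the Integer prefix are my choices here:

```pascal
uses
  System.Classes, System.SysUtils;

procedure WriteStr(Stream: TStream; const Value: string);
var
  Bytes : TBytes;
  Len   : Integer;
begin
  Bytes := TEncoding.UTF8.GetBytes(Value);
  Len := Length(Bytes);
  Stream.WriteBuffer(Len, SizeOf(Len));   // length prefix
  if Len > 0 then
    Stream.WriteBuffer(Bytes[0], Len);    // payload
end;

function ReadStr(Stream: TStream): string;
var
  Bytes : TBytes;
  Len   : Integer;
begin
  Stream.ReadBuffer(Len, SizeOf(Len));
  SetLength(Bytes, Len);
  if Len > 0 then
    Stream.ReadBuffer(Bytes[0], Len);
  Result := TEncoding.UTF8.GetString(Bytes);
end;
```

Because every record now has a variable length, you either need the jump table mentioned above or, better, a real database.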

Let's assume you have managed this in your whole app... I still haven't.

Long before I converted my source code to compile with XE, I thought it was a good idea to convert every String to an AnsiString, every Char to an AnsiChar, and every PChar to a PAnsiChar.

(At this point I was not aware that the VCL does not work like the Windows API, where nearly every function has an A and a W version for Ansi and wide strings.)

So each

Label1.Caption := Address.FirstName;

produces a warning while compiling. The compiler does the trick here and converts the short string to the Unicode caption string - so it works. By the way, this is called a possible solution in most white papers. It is not: yes, it compiles and yes, it runs, but the warnings are the problem, as we will see later.

Again, where is the problem? I call it the "var param problem".

Imagine you have the record from above. Then you probably have some methods to deal with this data structure, like:

Function MakeName(var Firstname, LastName : Str26) : Str80;

Using var was a good idea, so as not to copy 54 bytes onto the stack.

Perhaps the MakeName function calls other methods internally:

Function Crunch(var S : Str26) : Str26;
Begin
  While (S[0] > #0) and (S[Byte(S[0])] = ' ') do
    S[0] := Chr(Byte(S[0]) - 1);
  Crunch := S;
End;

Function EmptyName(var S : Str26) : Boolean;
Begin
  EmptyName := (Crunch(S) = '');
End;

Function MakeName(var Firstname, LastName : Str26) : Str80;
Begin
  if EmptyName(Firstname)
    then MakeName := Crunch(LastName)
    else MakeName := Crunch(LastName) + ', ' + Crunch(Firstname);
End;

Yes, if you did not know this: before Result was the way to go, assigning the value to the function name was how you had to write functions!
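For comparison, a Unicode-safe version without var params could look like this, using plain string and the RTL trim functions instead of the hand-written Crunch:

```pascal
uses
  System.SysUtils;

function MakeName(const Firstname, LastName: string): string;
begin
  // TrimRight replaces the manual trailing-blank removal of Crunch
  if Trim(Firstname) = '' then
    Result := TrimRight(LastName)
  else
    Result := TrimRight(LastName) + ', ' + TrimRight(Firstname);
end;
```

With const string params there is no stack copy either, because long strings are passed as references anyway.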

There were more stupid things we did in the old days.


Zeroing the record (with FillChar, for example) kills the reference and produces a memory leak if you have a long string in the record.

Move(Address, OldAddress, sizeof(Address));

Moves the reference and doesn't create a copy of it.
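Once the record contains long strings, a plain assignment is the safe replacement, because the compiler then handles the string references for you:

```pascal
OldAddress := Address;  // compiler-managed copy, reference counts stay correct
```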

With hundreds of methods using short or special short-string types as var params all over your source code, there is no easy way to convert all of this step by step to a different string type. Once you change one function that is called by nearly everything, you get a snowball effect of compiler errors.

Yes, most of the conversion besides the "var param problem" is done by the compiler. The problem is: you will get thousands of warnings, and you cannot ignore them, because some of them you have to deal with.

The hard task is: find the 500 or more real problems among 60,000 warnings. I started with XE2, and by ignoring most of the "probably lost" warnings caused by assigning a string to a short string, I'm down to ~7,500 warnings.

But here is some good news: with 10.4 we get custom managed records that can have Assign and Copy operators. For the first time there is a possibility to handle long strings in records the right way, by creating a new instance on copy.
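A sketch of such a custom managed record in 10.4 (the type and field names are illustrative):

```pascal
type
  TManagedAddress = record
    FirstName : string;
    class operator Assign(var Dest: TManagedAddress;
      const [ref] Src: TManagedAddress);
  end;

class operator TManagedAddress.Assign(var Dest: TManagedAddress;
  const [ref] Src: TManagedAddress);
begin
  // Called on Dest := Src; here we can force a real copy
  // instead of just sharing the string reference.
  Dest.FirstName := Copy(Src.FirstName, 1, Length(Src.FirstName));
end;
```

There are also Initialize and Finalize operators, so the record can manage its own lifetime much like a class.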

It would be so perfect if we had a compiler switch to turn off Unicode in XEx - sorry, just dreaming.

While refactoring the code towards XE, everything of course has to match binary 1:1, so every not-yet-converted data structure written to disk stays the same data. By using a source-code repository, you are able to maintain both trees (Unicode and non-Unicode) for a short time, until the conflicts become too many. So the only way to go is to make changes to your non-Unicode source that produce the same results with fewer warnings when compiled with the XEx compiler. Going this way, you want as few IFDEFs as possible in your source, because you already have a big pile of code to refactor, and IFDEFs make it less readable in the long term.

So, how to find a way to go?

It would be perfect if I could keep the short strings in my records as long as possible. Outside of these records, everything could be a "normal" string.

- impossible

Everything calls everything, and nearly every unit is linked to every other unit - not always directly, but in most cases via some unit in between.

Side note: Don't watch Uncle Bob's Clean Code stuff, because after that you will hate your work of the past 35 years even more than you already do. (We were young, we needed the money.)

No, this is wrong: EVERY developer should watch Sessions 1-6 from Uncle Bob.

Stop reading here and click on this link. If, after that, you still think you do not need to write unit tests - sorry, in this case I can't help you at all. (The only possibility is: you have skipped some parts of the videos.)

If I already had trustworthy unit-test coverage in place in my legacy app, EVERYTHING would be so much easier, with no fear of changing some of the old methods and dependencies.

And now? I've started a different approach: using the source code tokenizer from my source code formatter project, I was able to find all dependencies in all my unit uses clauses and removed ~400 units that are no longer needed in "this" unit. Not bad, but not enough.

I plan to restructure my dependencies with a brute-force or neural-network approach - we will see if this can work. The next thing would be a "path through the source" finder, to convert all var params from short string to string where possible.

So there's a long way to go, and in the end we are still on VCL 32-bit. 64-bit would be the next step. If everybody goes the ARM way, perhaps we will have to convert to FMX some time in the future.

To get something done, I'm going two ways: first, decoupling and writing unit tests; second, trying to get rid of the short strings, or at least of the non-critical warnings.

And of course everything just in my spare time. I wish I had...

If you have any good ideas, I'd like to read them in the comments.

So long... Happy coding and watch Uncle Bob!

Friday, July 24, 2020

My Fight for FMX!

If you know me, or if you read my blog, you probably know that I stand 100% behind FMX.
Well, I've been "into FMX" since XE2, but let's say since XE6, or better XE8, it has been really usable for every kind of development, desktop and mobile.

For many years I felt a little bit lonely in this field of development, but since 10.x the club of FMXers has been growing faster and faster.

Finally, one of the VCL component developers is taking the FMX road, too!

So please clap your hands and welcome DevExpress to the club.

Fun fact: The blog post about the new DevExpress FMX Grid CTP was posted by the VCL team - still with the VCL logo.

They also promise to include "every single VCL product" in their FMX offering... I think this means: "We will provide an FMX version of all our components."

So to all the haters who for years haven't taken FMX - or my love for it - seriously, I would like to answer with a quote from the DevExpress blog post: 

"If everyone moves to FireMonkey, we’ll be sure to follow."

Have a nice weekend!

PS.: If you want to start with FMX, don't forget to buy my Firemonkey Development Kit.

Saturday, June 13, 2020

CodeRage 2020 - Quickinfo...

Oh boy... Why has it always been so hard to do the easy stuff?

My idea was: just make two sessions, do some advertising on a sublevel, and I'm done...

But for session one, I had to explain so many things and create "some" screenshots. 
Normally a 45-minute video takes ~12h to create. Sometimes a little more, if I have to develop the examples first.

I'm getting faster on "the creating video task", but this time...

The first 10 minutes cost me 4 days. Perhaps I'm doing something wrong.

And session 2?

My estimate was "double the time" of session one... and I haven't even started the creation. 

It's time to speed up.

I'm also getting faster at the "two audio tracks" thing. Synchronizing the English and German audio is not a big deal anymore. (OK, it still costs 2.5x the time compared to a single audio track, but it's faster than doing the annoying subtitles.)

I hope you will like the two sessions, and that the work was not in vain or different from what you expected from the title.

So don't miss it on the 2nd of July! (The English versions of my videos go online while the German versions are streaming.)

PS.: Of course, if you're a user of my FDK, all the stuff shown in the video need not interest you, because many more sophisticated routines are already included in my framework. Perhaps take a look anyway - perhaps just for fun.

Monday, May 25, 2020

German CodeRage 2020

The German CodeRage will be live on the 2nd of July. I will present two sessions.

  1. 09:00 UTC Threads and Queues (11:00 MEST)
    How can I accelerate my application by using Queues and Threads to execute some workloads in the background? My session will show some examples of how to include this kind of asynchronous data processing in your app.
  2. 16:00 UTC SQLite in Threads (18:00 MEST)
    How can I use threads to access an SQLite database? Spoiler: you can use the techniques from session one.
If you want to see my sessions live at the CodeRage event, you should register; you can then also be part of the Q&A session.

If you prefer to watch the session in the English language, it will be online on my Youtube Channel at the same time!

So stay tuned and please subscribe to my channel - perhaps you will find some teasers ;-)!

Monday, May 11, 2020

Delphi 10.4 Sydney - #Delphi104 - #ComingSoon

Hello, my friends!

Yes, I've got permission to blog about the upcoming new version, Delphi 10.4 Sydney!

First I want to answer the most urgent question for all FMX-Mobile developers:

What about ARC?

It's gone, it's history, I hope I will never see any kind of ARC again!

Next on my list is Metal... No more OpenGL-(ES) on iOS.

It has a nice new feature: you can set your frame rate, e.g. fixed at 60 FPS, or only refresh the screen if something has changed. I haven't done any long-term tests with this setting, but I assume it will extend your battery life.

With both - no ARC and Metal - my iOS app is flying! Not only in the numbers; you can really feel the new performance. 

The other thing: the new managed records, which Marco Cantù has already blogged about. Please follow the link!

Why are these records so interesting? Well, I have a huge 34-year-old app, grown from Turbo Pascal to Delphi. In this app we're using mostly records for everything. At the moment we can only use short strings. 


Imagine a record with an Ansi- or Unicode string like:

type
  TFoo = record
    Name: AnsiString;
  end;

In our legacy code, nearly every procedure that wants to use a TFoo instance does something like this, with the backup copy made by a raw Move, as everywhere in the old code:

procedure Bar(var aFoo: TFoo);
var
  Buff: TFoo;
begin
  Move(aFoo, Buff, SizeOf(TFoo)); // raw byte copy
  aFoo.Name := 'Othername';
  aFoo := Buff;
end;

Because Move copies only the reference of Name, without updating its reference count, assigning a new value to aFoo.Name can free the old string, and Buff.Name becomes a dangling reference. So the buffer is not working at all!

With the new records you get an Assign operator, where you can explicitly call SetLength to create a real copy of the string.
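Based on the documented 10.4 syntax, such a managed record could look roughly like this - an untested sketch, not the final RTL-blessed pattern:

```delphi
type
  TFoo = record
    Name: AnsiString;
    class operator Assign(var Dest: TFoo; const [ref] Src: TFoo);
  end;

class operator TFoo.Assign(var Dest: TFoo; const [ref] Src: TFoo);
begin
  Dest.Name := Src.Name;                    // copies only the reference...
  SetLength(Dest.Name, Length(Dest.Name));  // ...SetLength forces a unique copy
end;
```

With this in place, even a plain `Buff := aFoo` gives Buff its own string instance, so the backup-and-restore pattern from above works as the original author intended.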

For the first time in history, the migration from D2007 to 10.4 will save a lot of work. We will see - still a long way to go.

BTW: Now it's a good time to renew your subscription. Please click on the banner below.

Happy coding with this #ComingSoon new Version of  #Delphi104!

Saturday, May 9, 2020

The database on network file problem!

Perhaps you're lucky and you're using a database server for your application, so every user has full access to the server at any time. Or your application will not run on a network, and you can easily use a simple file-based database like SQLite.

If not, welcome to the club of developers using no databases or any hacking tricks around the shared access problem.

There are some implementations out there in the wild with different approaches to overcome the problem.

You may ask: Why don't you just install a local database server?

Of course I could install a Firebird server or the free MS SQL Express server, but in these cases one PC must always be switched on so that it is reachable in the network to play the server role. As always, the easy solution is not possible.

Many of my clients have a really simple network, using the "Fritzbox" as the router and up to three PCs connected to it. To be able to run our software on any of these PCs, without having to switch on a "server PC" each time, there is a NAS connected to the Fritzbox holding all the data.

Shared access is handled by the file system. Yes, this is a working solution, no question! But as every application and also the stored data keeps growing, there is a point where a real database would solve many problems.

Googling for SQLite and network, you will find some implementations: uSQLiteServer, SQL4Sockets, the SQLite ODBC driver, or SQLiteDBMS. And you also find an easy protocol for handling these calls (TechFell).

Without going too deep into the research: everybody is using some kind of TCP/IP socket handler to restrict access to the "database".

So why restrict yourself to some cheap interface that can only handle the easy stuff?

Let's collect our needs. We want:
  • locking
  • an easy-to-use interface
  • thread safety would be perfect
  • perhaps asynchronous access
  • some kind of remote procedure calls
  • perhaps some kind of caching? 

This should all be possible to write in a reasonable amount of time. The caching could be a challenge, but I would love to see a 64-bit implementation of this that is usable from a 32-bit application, so the always-empty 14 GB of spare memory could finally be filled with something useful.

I think I will start with a nice, slim socket implementation, using UDP broadcast to find other clients or the "server". Then connect over TCP, implement a simple low-level protocol for the handshake, ping, and version checking. Perhaps a plugin system that can auto-reload new versions from a server.
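The UDP discovery step could be sketched with Indy's TIdUDPClient roughly like this - the port number and protocol strings are made up for illustration:

```delphi
uses
  IdUDPClient;

// Hypothetical discovery call: shout into the LAN, wait for the server to answer
function DiscoverServer: string;
var
  UDP: TIdUDPClient;
begin
  UDP := TIdUDPClient.Create(nil);
  try
    UDP.BroadcastEnabled := True;
    UDP.Broadcast('WHO_IS_SERVER?', 54321);  // made-up message and port
    // An active server would answer e.g. 'SERVER 192.168.178.10';
    // an empty result means nobody answered within the timeout
    Result := UDP.ReceiveString(2000);       // wait up to 2 seconds
  finally
    UDP.Free;
  end;
end;
```

If no answer comes back, the client could elect itself as the new "server" - that would be the handover logic described above.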

Yes this is all doable...

Why reinvent the wheel? My answer, as always: because my wheels run better. Or at least I think so... 

One big problem is still in my way: finding the time to do this.

Perhaps you would like to see me trying this live on YouTube? Or my breakdown, because I have underestimated the problem... In any case, please leave a comment and subscribe to my channel.

I have to travel to a planet with a lower rotation speed!


Tuesday, May 5, 2020

Is a database just a data storage?

In the old days - and remember the main title of my blog: "from old school..." - data was saved in files. For the new kids on the block: a file is storage on your hard disk; a hard disk was a device with spinning disks inside. A read/write head could store and read bytes to and from it.

So in those days we just wrote a block of bytes to these files; we used the "BlockWrite" command to do this! Why? Because it is and was the fastest way to write a record binary to disk.

And NO, streams are not faster. Down in the RTL, a TFileStream uses the same functions, but needs more calls to get there. Maybe you call "BlockRead" and "BlockWrite" old style, but I don't care.

Before we got hard disks, we had floppies. You got the best performance out of a floppy if you could provide a buffer to read the whole track in one rotation. If your CPU or your floppy controller was not fast enough, the sectors on the track had to be interleaved, and in this case you needed more than one rotation - too bad.

What was the title of this post?

Oh yes. We stored data, mainly records, in binary files. Sometimes we had an index. The index was a string and a seek number. We loaded the index file, found the matching string, and used the seek number to find the record in the binary file. If we had this index, we called it a database.

What about the performance? Besides the indexing algorithm, a database also has to load the data from disk, and it uses the same OS functions to do so. I assume a "normal" database that uses a file to store its data needs more than one block read. And on the client side? For a dataset with 100 fields you have to write 100 times: FieldValue := Query.FieldByName('FieldName').AsString. This is so awfully slow... With one "BlockRead" I get a 10 KB record with 1,000 fields in the blink of a nanosecond (or less). Just one call!
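For the new kids again, the classic one-call read looks roughly like this - record layout and file name are made up, but the pattern is the real old-school one:

```delphi
type
  TCustomerRec = record
    Id: Integer;
    Name: string[80];   // short string: fixed size, safe for binary I/O
    Balance: Double;
  end;

procedure LoadFirstRecord(const AFileName: string; out ARec: TCustomerRec);
var
  F: File;              // untyped file
  BytesRead: Integer;
begin
  AssignFile(F, AFileName);
  Reset(F, 1);          // record size of 1 byte, so BlockRead counts bytes
  try
    // The whole record arrives in a single call - no field-by-field copying
    BlockRead(F, ARec, SizeOf(ARec), BytesRead);
  finally
    CloseFile(F);
  end;
end;
```

Note that this only works because the record contains no long strings or other reference types; that is exactly why the legacy app is stuck on short strings.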

Perhaps that's why, knowing all this, I use a database nearly the same way as in the old days: the CRUD way. Just do Create, Read, Update, and Delete!

That's why I could migrate all my applications to a REST-Server in minutes.

Yes, I've used "JOIN" once or twice, and also a trigger or stored procedure, but just because somebody told me: "Let the database server do this; the database server can do it better." In some cases this is absolutely right, especially if you're dealing with really big datasets and/or your database is on a remote computer. That's for sure! Sending an update to a table with constraints is much easier than doing this with "BlockWrite", no question!

Having session-based I/O while updating the customer, invoice, and stock tables in one call - and if anything goes wrong, just rollback instead of commit. Oh man, that helps a lot.

In a few cases I only read some fields of a row, but most of the time I need all fields. So "SELECT * ..." is the call. And after I've got all the data, I need the mentioned field-by-field assignment.

That's why I've programmed the JSONStore client-server database handler in my Firemonkey Development Kit. I know the name is bad - you can use this unit in your VCL application, too!
Just select which fields you want to have database access to; all other fields are stored in a blob field. Of course, I compress the JSON before storing it. After loading the dataset from the server (over REST) or from a local database, you read your database fields the normal way, and after that just let the RTTI do its JSONToObject thing. Done...
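The FDK internals are not shown here, but the general pattern - compressed JSON in a blob field plus RTTI-based (de)serialization - can be sketched with the stock RTL like this; the class and function names are made up for the example:

```delphi
uses
  System.SysUtils, System.ZLib, REST.Json;

type
  TCustomer = class
  public
    Name: string;
    City: string;
  end;

// Serialize an object into compressed bytes for the blob field
function ObjectToBlob(AObj: TObject): TBytes;
var
  Json: TBytes;
begin
  Json := TEncoding.UTF8.GetBytes(TJson.ObjectToJsonString(AObj));
  ZCompress(Json, Result);
end;

// Restore the object from the blob field via RTTI
function BlobToCustomer(const ABlob: TBytes): TCustomer;
var
  Json: TBytes;
begin
  ZDecompress(ABlob, Json);
  Result := TJson.JsonToObject<TCustomer>(TEncoding.UTF8.GetString(Json));
end;
```

The few indexed key fields stay as regular columns for querying; everything else travels through the blob in one read and one write.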

Hey, compare this to the old style! A database with some keys and a blob field that can read and write all the data in just one call to our record/class. We are back in the '80s - well done!

One thing is different: these days we have 5 GHz, a 64-bit CPU, and most of the time 8 cores or more - not 3 MHz, an 8-bit CPU, one core, and only 64 KB (not MB, not GB) of RAM.

But we are lucky, because with all that memory, all those cores, and all that CPU clock speed, we can read our data from the database at the same speed as in the '80s...

I love DB's...

Thursday, April 23, 2020

Live event : The Apocalypse Coding Group.

Don't miss Parts 7 & 8 of the Apocalypse Coding Group on 25 & 26 April, 14:00 UTC.

Andrea Magni
Craig Chapman
Glenn Dufke
Ian Barker
Jim McKeeth
and me...

Where MVPs are trying to convince the viewers that they are worthy of this title, although it did not look like that in live streams 1-6, with 4 hours each.
It's fun and you can annoy us in the live chat!

Part-7 :
Part-8 :

The live stream is on Craig Chapman's channel! Please don't forget to subscribe to his and my channel so you don't miss the upcoming events we are currently planning together!

Legacy Applications

OK - here is the problem:

An old application, started with TP 3, grown to many millions of LOC.
Full of Moves, Records, and other stuff that is not ready to move from pre-Unicode to Unicode.

That means no real RTTI, no Generics, no FireDAC, no native HTTP, no ITask, and none of the other stuff you love if you are using 10.x. 

With a look at the roadmap ;-) the new records would help, but anyway it's a huge task to get it running with XE.

The first idea was to use a DLL - of course - and this works, but not really: there is no properly working shared memory, and an FMX DLL also has its problems. 

So you have to serialize everything to and back from the DLL. And if you have to serialize everything anyway, you could just as well do it over TCP. 

The multi-user network sharing works, but could use some improvements.

A local database would also be a good idea, replacing the old Enz-ISAM that I ported to Windows a long time ago.

Installing a real DB-Server is not possible. Any options?

I could install a Service on each workstation in the network.

What can a service do for you?
First of all: no more problems with admin rights. Installing anything else is a piece of cake.
The service apps on each workstation could talk to each other over a TCP connection. So without a dedicated server, the service apps could name one of themselves "the server"; if that workstation is going to shut down, another workstation could be named "the server" from then on.
Every running app could ask its local service: "Hey, give me the IP of the server." No need for configs. And instead of 4 GB, the server could use all the memory and load DLLs for different tasks. 

And the client-side?
The client can use a simple interface to the service, like OpenDB, LockTable, WriteData, UnlockTable, and CloseDB (CRUD with locking). Every command must (again) be serialized over TCP. The server could maintain the locking for each table. Voilà - a dedicated SQLite server, or any other DB. (And of course much more.)
Internet updates, a DB cache, and any other service that is much easier to write in XE than in D2007.
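Such a client-side interface could be declared roughly like this - the method set and names are hypothetical, just a sketch of the CRUD-with-locking idea:

```delphi
type
  // Hypothetical service contract; the GUID is arbitrary
  IDBService = interface
    ['{8C5A2E14-3F6B-4A9D-9C01-2E7B5D4F1A23}']
    function  OpenDB(const ADatabase: string): Boolean;
    function  LockTable(const ATable: string): Boolean;
    function  WriteData(const ATable: string; const AData: TBytes): Boolean;
    function  ReadData(const ATable: string; AKey: Integer): TBytes;
    procedure UnlockTable(const ATable: string);
    procedure CloseDB;
  end;
```

A proxy class implementing this interface would serialize each call over TCP to whichever workstation currently plays "the server"; the application code never needs to know which machine that is.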

This is the Idea...

Do you want to see me struggle to implement this? Perhaps in a live YouTube session?
Or are you more into FMX and MVVM?

Anyway - please subscribe to my YouTube channel and leave a comment on what you want to see next!

Have a nice separation...

Friday, April 17, 2020

Apocalypse Coding Group Part V


After the little Easter break, we will continue on the weekend (18.04 & 19.04) with our live-coding project of a simple card game. (14:00 UTC)

We are happy that MVP Andrea Magni will be part of the project this time; surely he will be able to help us with his experience with his MARS server.

The live event will be hosted by Chapman World again. The whole thing is of course interactive - so if you have any questions, feel free to ask them via the chat.


PS.: Thanks to all subscribers of my Youtube channel. I would like to remind you again that I need feedback on what topics you want to see in the live coding.

Monday, April 13, 2020

CodeRage 2020

Hello, my friends!

The German CodeRage 2020 is in preparation. Of course, I want to be part of it. This time it looks like my session will have something to do with threads and queues: how to speed up your application.
This topic was the one chosen from three of my ideas. The others were:

Easy Database: How to create and work with a database without clicking components on a form


3D in 14 days: How to create a 3D engine for simple tasks.

I would like to create the same session with English audio as well - so here is my question:

What would be your favorite topic besides my normal sessions on YouTube?

My session goes live on 2 July 2020, in German on the EMBT channel and in English on my channel!

Please leave a comment.

Monday, April 6, 2020

Blog posts in German.

I was asked by Embarcadero Germany to also write some blog posts in German, since I normally write everything "here" in English...

So from now on you will find my German blog posts on the Embarcadero site at:

Also, just for you German Delphi developers: my newest video on my YouTube channel is online.

Edit: LOL - I had changed the title, but the RSS feed was faster...

Tuesday, March 31, 2020

The Apocalypse Coding Group.

I called it a weekly round table, but Craig had this nicer idea...

So last weekend I woke up and saw I'd been invited to a new Skype group called "Apocalypse Coding Group", and there were already more than 99 messages waiting. (I'm 8h ahead of the US, or as we call it, UTC+1.) 

Members of the group are:

- Craig Chapman
- Glenn Dufke
- Ian Barker
- Jim McKeeth
- Frank Lauter

To keep all the #StayAtHome developers happy and entertained, Craig would like to build a simple card game web application, with a nice TMS WebCore frontend.

So with this idea in mind, he started a YouTube stream.

If you have 4h (or 8:45h for both parts) of spare time and want to see 5 "pro" developers make fools of themselves: here is the link for Part I (4:12h) and Part II (4:26h). (So far.)

Of course, if any of us had done this project alone and not streamed it, he could have a finished, working project online by now, but that is not the point. The idea is to keep people entertained with this kind of interactive show.

Although only a few people found the live stream - the announcement was a little too short-notice - we are happy to see that the replay is being watched from all over the world. 

To answer some questions:

- The source is also online to the public at GitHub.
- We are doing this not for sale or money, just for your joy.
- There is no Patreon page.

Don't miss the next session - next weekend. We hope that more developers find the time to join the live session so it can be more interactive. 

btw.: The next D.MVVM video is nearly ready and will be uploaded to my channel shortly.

My plan was to upload a new session of my FDK and D.MVVM series every week, but I have had some trouble staying focused on my work these days.

However, coffee is always gladly accepted if you like our/my stuff... So stay tuned for upcoming new videos and live streams on Craig's and my channel; please subscribe and don't forget to hit the bell icon to get the notifications.

Sunday, March 15, 2020

Delphi monthly round table!

Meetings and shows are canceled all over. The next Apple presentation will also be online-only.

Let us take this as an opportunity to put a new format in place. Yes, of course, we could do webinars. But apart from the Q&A session at the end, that is not very interactive.

I would like to try something different. We could use TeamViewer Meeting, or just go audio-only. One way to go is a TeamSpeak server; I installed this kind of server many years ago.

At the moment I'm collecting topics.

I've already talked to Jim about this and perhaps we will join with other MVPs to a monthly round table.

If you are interested please leave a comment and stay tuned for the timetable of events. This will be online in a few days.

[EDIT 16.03.20]
This is open to everyone - it would be nice if some MVPs were also online, perhaps to answer your questions. And if YOU can answer a question, you are welcome to answer it, too!

[EDIT 17.03.20]
I forgot: TeamSpeak-Server is up and running.

Password: EMBT

Thursday, March 12, 2020

ProjectH - Your solution for Project Management.

8 Weeks ago I did a survey about project management software. // Still open

The answers were as expected: "Nothing on the market works for me" was the common answer.

I've googled, I've tested some of "them" and this is my result:

a.) it is too simple and doesn't fulfill my needs
b.) it is too complex and overkill for a single person or a small team
c.) perhaps it fits, but it has a monthly fee that is too high - I'm looking for a one-time purchase.
d.) it's only a website

And of course combinations of a-d; most of the systems are b, c, and d. Perhaps some big companies would pay for the "Enterprise" solution at $1,500 US/month, but that is not my goal.

Should it be open source or paid source? The survey ratio was 50/50. 80% are not using any tool at the moment. There are currently 6 developers interested in helping me build this kind of software. But 50% only want to help if it is an open-source project.

Perhaps an idea would be to try to finance the development over Kickstarter or a similar portal, but I've never tried this and I have no clue whether it would work. It would be perfect if we had some kind of software that could calculate the cost of development, or show how long it would take to develop this kind of software. ;-)

Let's forget about the costs or the time it will take. I think I could do my development much better with an app that can give me a good selection of items based on some rules. Hmm... Here are some examples:

  • I have just one hour until the next Skype meeting. "Hey H, give me a small topic that can be completed in less than one hour."
  • I want to develop a topic today that is needed for more than one project and/or that more than one project depends on (like a PayPal interface). "Hey H, give me the most-needed topic across all of my projects."
  • I'm not creative today. The best thing to do is not programming. "Hey H, give me a topic without programming." - "You could write some documentation for the MVVM Framework." - Oh crap, I hate writing docs... But whatever.
  • I could get a new assignment developing an app. By predicting the time, I get a sum of 240 points. "Hey H, give me a release date if I add 240 points to my workload while suspending private projects." - "You are able to start the new project in 3 weeks, and it will be ready in 8 months, without harming any other work-related projects."

I know I'm dreaming... But at the moment EVERYTHING is a guessing game and I'm sure I could work better with some kind of project management tool.

So, I've got a few things I would like to see in this app. It has to be a standalone Windows/Mac/Linux application, able to use a REST service for data storage and data exchange. A mobile app would also be useful. Perhaps some kind of time-tracking device, like one of those cubes you can rotate on your desk. Or just use your phone with the app and some buttons (working, on the phone, taking a break).
There have to be points associated with items. Of course some graphs like Gantt or waterfall, perhaps like in mind maps. Milestones and release/update prediction. An interface to a bug tracker, and perhaps some hooks into git or Mercurial. 

I'm still collecting. What do you have in mind? Please leave a comment.

Wednesday, March 11, 2020

Development environment and the ASUS support.

I spent "some" money on my PC to get reasonable speed for my work. I didn't have the time to build my last PC myself, so I ordered it over the Internet. Overclocked to 4.5 GHz, water cooling, a RAID 5 disk array for storage and RAID 0 for the boot drive... 32 GB RAM.

Top on the list at this time.

I upgraded my system with a new RAID controller and 4 SSDs in RAID 10. Wow, this was a performance boost. With this upgrade I started to work only in a VM. This was the best decision I ever made. A little later I got one of the first 1080 Tis available here in Germany.

I saw some videos about M.2 SSDs, and one from Linus Tech Tips triggered me: the ASUS Hyper M.2 x16 card. With four M.2 drives, that had to be my next upgrade. ~10,000 MB/s.

I ordered this card and 4x 2TB M.2 drives and... total disappointment. Only one M.2 showed up; the other three weren't working. I called the ASUS hotline and they told me: you must have a Z390 board to get all 4 slots running. 

Since my Rampage IV Extreme board and CPU were running fine, I didn't want to buy anything new again, but over time I finally ordered all the upgrade parts.

- ROG Maximus XI Extreme
- i9 - 9900K - 5.0 GHz on all cores.
- 64 GB of G.Skill 3600 DDR4 - 17-19-19-39
- New water pump (the last one was getting old)

The board, CPU, and RAM work perfectly - nevertheless, only 2 M.2 drives showed up... Next session with the ASUS hotline: no, this configuration does not work. Great.

So I used the 2 M.2 drives on the mainboard as a RAID 0 - works nicely (not hardware RAID). The benchmark was ~3,000 MB/s read/write.

After 6 months I started getting errors from time to time on one M.2. Windows could always reactivate the RAID, but this is always a bad sign. This time I ordered another card, the "ADWITS Quad-M.2-NVMe-SSD-PCIe-16x-8x-Adapter"; it should work with every mainboard and provide up to 6,500 MB/s. I also ordered 4 new Samsung 970 EVO Plus 2TB drives; I had to order them from different resellers. Btw, it was a good time to upgrade my 1080 Ti to a new 2080 Ti.

Good news: the card works. Bad news: not all 4 M.2 drives have the same size - 2 have 1862.89 GB, 2 have 1863 GB. That's why Windows is unable to bind them into a RAID 5 (or "RAID with parity", as Windows calls it). But a RAID 0 works fine, and I get 5,550 MB/s read/write speed on this huge 7.5 TB formatted "disk"... Not bad. The VM and Delphi work fine.

I chose XMP II for the RAM and enabled the CPU clock AI. It tuned the clock down a little bit; I have no idea why, because it runs stable at 5 GHz on all cores in all my tests. Since my system runs 24/7 at 42 °C or less, I'm fine with 4.8 GHz.

So far so good. Camtasia now uses the GPU for rendering, so let's do some tutorial videos on the FDK and the new D.MVVM framework.

Monday, February 24, 2020

Live Youtube, Chat, FDK & MVVM...

Since 2017 I have this topic on my todo list:

- Create Youtube Video's

After my FDK was ready for release, I wrote a little documentation of sorts and some demos, but everything was a little too "short".

The other idea was to convince people to buy my framework. While the first customer was waiting for the release, I had to develop my fancy setup/update/shop installer, and also the local service desktop software to handle versions, uploads, downloads, and payment. Too bad that at this time my MVVM framework (version 2 or 3) was not as good as my "final" version today.

So much to do, but no time for videos. The last time I uploaded a video to my channel was 6 years ago...

Now that my MVVM framework is working, I'm able to refactor my service app. I also had to improve the invoice printing. For the first time I'm using the PDF stuff from my friends at Gnostice.

While talking to Craig Chapman, he asked me why I don't do live streams... I've never tried, besides one round table with Jim McKeeth. 

A live session only makes sense to me if it is announced in advance. If nobody is watching - especially on my channel with only 35 subscribers (at the moment) - it is no fun. Users must also be able to chat with me... My own chat software with WebSockets is not ready at the moment, so I installed a free website chat script on my homepage.

Perhaps I'll start with some normal tutorial videos for FMX, my FDK, and my new MVVM framework. Then pre-recorded tutorials with a Q&A session, like the webinars.

In any case, please subscribe to my channel and hit the bell icon for notifications. If I get enough subscribers, I'll try the "live thing", and we could also have friends like Craig or Olaf join in.

Wouldn't it be great to have a monthly round table?

Please leave a comment! 

PS.: I have had a TeamSpeak server running for this kind of round table since 06.10.2011. It has only been used once, for Delphi talks...