Wednesday, December 23, 2020

XMas time is Unit-Test time...

Give yourself a gift that will last longer than New Year's Eve. In many cases, you will be happy about it for many years.

There are still some grinches who don't believe in the satisfaction of unit tests and think: It's just a myth told at developer conferences.

To be honest, I program too few unit tests, but enough to find bugs I would never have discovered otherwise.

Just yesterday, a small change in my FDK would have caused many of my applications to stop working. However, since I use Stefan's TestInsight, I got immediate feedback from my unit tests when I hit save...

Here for example:

before:

Function FindSomeThing( const aName : String ) : boolean;
var
  lShelp : String;
begin
  Result := FindField('f'+aName.ToLower,
                          aName.ToLower,Nr,lShelp);
end;

after:

Function FindSomeThing( const aName : String ) : boolean;
var
  lShelp : String;
begin
  lShelp := aName.ToLower;

  Result := FindField('f'+lShelp,
                          lShelp,Nr,lShelp);

end;

My goal was just to do the .ToLower once. I already had this helper string for an unused param, so where is the problem?

Five tests failed after I hit save.

Let's have a look at the FindField method.

Function FindField(Const aName1, aName2 : String; out ListNr : Integer; out Name : String) : boolean;

aName1 contains only the "f" and aName2 is empty - because of the out param. lShelp is passed as the out parameter Name, so it gets cleared as part of the call; aName2 references the very same (now empty) variable, and 'f'+lShelp was obviously evaluated after the clearing, so only the 'f' survived. Did you know this "problem"?
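
Here is a minimal, stand-alone sketch of the effect (not the FDK code): the out parameter is finalized at the call site, and because a const string parameter is only a reference, a routine that receives the same variable twice already sees an empty input.

procedure TakeBoth(const aIn : String; out aOut : String);
begin
  // aIn comes through empty here: aOut (the very same variable)
  // was cleared before the body was entered.
  Writeln('aIn = "', aIn, '"');
  aOut := 'result';
end;

var
  lShelp : String;
begin
  lShelp := 'hello';
  TakeBoth(lShelp, lShelp);   // aIn arrives as '' - the effect described above
end.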

So use the quiet time to write one or two unit tests, maybe for a routine that has always caused problems or just to get better source code coverage.

And if you haven't found the time to deal with unit tests yet, then use this quiet time and try it with simple examples.

Merry Christmas and a happy new year.

Frank

Tuesday, December 8, 2020

Create your Website content with Delphi!

Every time I have to create some kind of Webpage, I have to decide how to build the dynamic content. 

This is a follow-up to my other blog post: "How to write your own Serverside-Extension!"

In the old days, it was easy! You had to provide a webpage with a maximum horizontal resolution of 800px, because that was the most common "device" (PC) resolution among your visitors.

To keep the navigation of each page in only one place, your main design was a framed page setup. I still love this concept and I really dislike these horrible "scroll down to the earth's core" pages. Of course, on a smartphone the frame concept does not work anymore.

To build a nice maintainable page you can use ASP.NET. With ASP.NET it is easy to create a master page and content pages. The server takes care of combining these two (or more) files at runtime; in nearly every other setup you have to copy the navigation to each page.

Btw. if you like PHP, don't talk to me... ;-)

Creating webpages with ASP.NET and Delphi is nearly the best you can do. You have the JIT compiler that creates the binary code in the background for direct execution, you have "native" code that is executed by the IIS, and your pages are super fast. No need to stop the IIS to change the DLL or anything else.

Too bad EMBT/CodeGear dropped the support of ASP.NET after D2007. Yes, the HTML designer was bad, and after D2007 there was Prism. I've done some pages with Prism, but it is not the same.

There is another thing called VCL for the Web. This works with Delphi forms and really long state data for the back button. This works also fine but is a completely different thing.

The other possibility is a WebBroker application. This can be used on Linux and Windows. With a loader that is able to unload and update your DLL (Windows), you can easily roll out new versions.

If you want more information about this topic, please follow the link above.

The IIS will create a new thread for the request and execute your DLL. This is fast as hell, because this is really native CPU execution of your code. The final content, as a string or a result stream, is handled by the IIS and transferred to the user.

With this kind of DLL every single char/byte is streamed from your WebBroker application and you have 100% control of the final result. You can load a page from disk and process it - change tokens to different content or create a complete table from a database. This is fast, that is not the problem, but often you have to provide extra HTML content to format your final result, and this is a little drawback.

Your WebBroker application could also be used as a RESTful web service and the result could be XML, JSON, or SOAP, but this is not the topic for today.

For example: If you want to present a list of persons nicely formatted, you could change a token <#PersonList> in your HTML-Page to a list of persons from the database.

Let's ignore for the moment that most HTML-Editors dislike this kind of token.

You have to format the content with additional HTML-Tags like <br /> or <ul> and <li> or even a <table> with CSS. If you have a page that is static for a long time - no problem. But with every change of the presentation, you have to change your source code, recompile and upload a new DLL.
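
To make this concrete, here is a minimal sketch of the classic WebBroker token approach, using a TPageProducer and its OnHTMLTag event (Web.HTTPApp / Web.HTTPProd). The method name and the GetPersonNames helper are made up and just stand for any database query:

// The producer's HTMLDoc/HTMLFile contains the <#PersonList> token.
procedure TWebModule1.PageProducerHTMLTag(Sender: TObject; Tag: TTag;
  const TagString: string; TagParams: TStrings; var ReplaceText: string);
var
  lName : string;
begin
  if SameText(TagString, 'PersonList') then
  begin
    // The extra formatting HTML mentioned above lives here, in the Delphi code.
    ReplaceText := '<ul>';
    for lName in GetPersonNames do        // hypothetical database helper
      ReplaceText := ReplaceText + '<li>' + lName + '</li>';
    ReplaceText := ReplaceText + '</ul>';
  end;
end;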

You could use an async call in JavaScript to your DLL as a REST call and process the JSON result, also with JavaScript, into the final HTML, but this is horrible from the viewpoint of a Delphi developer.

So what to do?

For many years I've used the WebBroker approach and it works for me. This time I have the first project where the web designer is unable to use Delphi and the Delphi developer has no clue about web design or JavaScript.

Well, DelphiWebScript (DWScript) is perhaps one possibility, but it is much too Delphi-driven.

After some research, I found a project from Marco Cantù implementing Razor. Razor is also known from C# and ASP.NET, but Marco's implementation uses the WebBroker to process HTML pages that contain the Razor @ syntax.

This is really nice, the syntax is given, and the HTML editors like the tags with @Whatever much more than the WebBroker tags. The implementation of loops is "JavaScript-like" and it is very easy to include HTML tags in a loop for displaying data from a database or a simple loop. So the web designer is able to change the look without a recompile of the DLL. The things I dislike about this approach are:

  • You have to compare every byte in the source HTML to find the @-Tag
  • You have to syntax check the HTML-Source
  • You have to replace the @include Tags after all the processing.

Perhaps your server is fast enough to do this byte by byte comparison for every request in real-time, but this is against every rule that is deeply implemented into my development soul. 

Perhaps I should implement a converter to process the HTML pages and create a jump-table to do the trick? With a jump-table, you do not need to search for the @ token... There is still the include-page tag problem, but if the included page also has a jump-table, it is more like a "just-stream-copy" thing than processing a string replacement...
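
Just to illustrate the jump-table idea (the names are made up, nothing from Marco's repository): the precompiled page becomes a list of segments, each either a literal slice of the HTML to stream-copy or a token to resolve.

type
  TSegmentKind = (skLiteral, skToken, skInclude);

  TSegment = record
    Kind  : TSegmentKind;
    Start : Integer;   // offset of a literal slice in the original HTML
    Len   : Integer;   // length of that slice
    Token : String;    // token or include name, if Kind <> skLiteral
  end;

  TCompiledPage = TArray<TSegment>;

At request time you only walk this array - stream-copy the literal slices and resolve the tokens - instead of comparing every byte of the page.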

After one day...

I call it a JIT-Compiler.

When the WebBroker wants to load an HTML page, I check whether there is a ".htbin" file with the same timestamp. I use FindFirst instead of FileExists for this check (with FindFirst I get the TSearchRec and the timestamp). If my bin file is there, I'm done and can use the prechecked bin-format file with the jump-table; if not, I use the normal processing for this first request and create the bin file on the fly for the next one. This is really fast and I like this idea.
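
A minimal sketch of that check (System.SysUtils; the file names are only an example) - FindFirst gives you the TSearchRec including the timestamp in one call:

function NeedsCompile(const aHtmlFile : String) : Boolean;
var
  lSrc, lBin : TSearchRec;
begin
  Result := True;
  if FindFirst(aHtmlFile, faAnyFile, lSrc) = 0 then
  try
    if FindFirst(ChangeFileExt(aHtmlFile, '.htbin'), faAnyFile, lBin) = 0 then
    try
      // both files exist: recompile only if the timestamps differ
      Result := lSrc.TimeStamp <> lBin.TimeStamp;
    finally
      FindClose(lBin);
    end;
  finally
    FindClose(lSrc);
  end;
end;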

Should I include a Delphi-Script tag to process Delphi source? Perhaps in the future.

Do you want to see a Webinar for this? 

Do you want to use this for your own projects? Should I do a git fork of Marco's repository and put this on github?

Please leave a comment...

And as always: Please visit my YouTube channel and hit subscribe to help me grow my channel!




Monday, October 26, 2020

Set up your Bitbucket replacement for your Mercurial repository

A long time ago, the German DelphiPraxis community convinced me to abandon my previous strategy of saving my source code by simply zipping it, and to use a source code management system instead.

So I googled a little bit and found SVN.

After some days of getting used to it - I proudly announced: "I have switched to SVN"! And with that, the shitstorm began.

Why aren't you using a distributed revision control system? Don't use this old SVN system! Use Mercurial or Git!

So I took the first one, Mercurial, and with a little help and a talk from MVP Uwe Raabe at one of the Delphi-Tage events, I realized that this system "just" works very well and offers all the advantages I missed in SVN. (Sorry, Craig.)

Pushing everything for backup to a network drive or a USB stick works perfectly. So no need for the old SVN server installation on my local server, because Mercurial is file-based. But after some time I had to share my source code with other developers, and here Bitbucket comes into play.

The use of BitBucket was as easy as Mercurial itself. 

I had never used Git, but I was told that it was just like Mercurial. Unfortunately, this is not the case. Even the first time I used Git, there were big differences and I immediately hated it. But this was not a big deal, because I could use Mercurial for my local repositories and Git for sharing where necessary.

I know many more developers using Git than Mercurial, but I never got used to the remote and local heads for different users, and the parallel branches are shown differently compared to Mercurial.

After Bitbucket dropped Mercurial repositories and hg was no longer supported, I had to find an alternative. After some testing, I was able to run SCM-Manager on a virtual Ubuntu installation. With the tutorials available online, the installation was not complicated. Unfortunately, SSL did not work out of the box.

SCM-Manager is a Java application, and that is not so easy to combine with certbot for Let's Encrypt. As far as I know, you have to convert the certificates and install them into the Java keystore. I like the cronjob-based renewal, so converting by hand was not my goal.

I found a little hint that the nginx server could be used as a reverse proxy to handle the https/SSL connection and leave the Java installation unchanged.

Installing software on Linux with this apt-get stuff is really easy, and I wish this kind of installation were also possible on Windows. One drawback is often the location of config files, or the changes you have to make to them by hand. In this case, I had to type the config file from scratch.

By adding a new A record to the nameserver, I was able to run certbot with the "--nginx" param, which not only installed the certificate but also added the necessary directives to my handwritten config file. Well done - this kind of "service" is really nice...

With some tweaking and some reload/retries, the server was reachable over https!

Small repositories worked immediately, but large files could not be transferred and caused an hg error: "request entity too large". The reason for this problem could only be the newly installed proxy server, because everything had worked before. But this problem, too, could be solved by a little googling.
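
For reference, the relevant part of the nginx site config ended up roughly in this shape (a sketch, not my exact file; the server name and backend port are placeholders). The "request entity too large" error comes from nginx's default 1 MB client_max_body_size, which large pushes exceed:

server {
    listen 443 ssl;
    server_name scm.example.com;            # placeholder

    # certbot --nginx inserts the ssl_certificate lines here

    client_max_body_size 0;                 # lift the 1 MB default that caused
                                            # "request entity too large" on push

    location / {
        proxy_pass http://127.0.0.1:8080;   # the Java server behind the proxy
        proxy_set_header Host $host;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
    }
}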

So now I have my own Mercurial server. If you want to know more about this, please say so in the comments, and I will try to collect my Google knowledge into another post.



Friday, October 9, 2020

How to write your own Serverside-Extension!

Serverside Extension (IIS or Apache)!

For many years I have had my own webserver. In my case, this was always a Windows Server, because I want to do my server-side scripts in Delphi! Apart from that, I had no clue about a Unix/Linux server!

Being part of the Apocalypse Coding Group opened this door, and during many hours of development of our little game, I could prove that the same ISAPI.dll project could also be compiled and executed on an Ubuntu Apache server!

I love Delphi. 

Building a new website or web service in the past, I had to make a choice:

For the website:

1. ASP.NET (Too bad only up to D2007)
2. ISAPI.DLL

After using ISAPI.dll's for a long time, I more and more often used ASP.NET. The code-behind for a webpage is not locked by the server, so it was easy to update the DLL files. This was also possible by using the well-known "egg" loader. Updating the server to a newer version suddenly prevented the loader from working, and I never had the time to find the reason for this. That's why I preferred ASP.NET, and for the other ISAPI.dll's I always had to stop and start my IIS server.

For the web service:

1. SOAP

SOAP was the way to go because it was so easy to use. Just define your interface and use it as if it were local in your application. The only drawback is the huge amount of data that is transferred with every call. That's why I did not use it in the "native" way but just to transport an encrypted, compressed binary stream.

This was the past... But as time goes by...

If your traffic is increasing and you want to do many more things with your server, there comes a time when you realize that a 2 core server with an ancient operating system is just not up to date anymore.

Virtualization is one way to go. In my case, I use Proxmox on a beefy server with 2 CPUs, many cores, lots of RAM, and a RAID SSD drive... Nice...

This server hosts two or more Windows 2019 Servers and many Ubuntu servers for small web services or microservices.

Of course, for web services "we" are doing JSON/RESTful these days. So what is needed to do all the fancy stuff?

ISAPI.DLL's - best performance - native execution, no script interpretation. For a web-site there are two ways to go: (perhaps many more, but not for me)

1. Use a "normal" HTML page and AJAX background calls to get the dynamic content you need for your site.

2. Use the WebBroker implementation and deliver the content directly from your DLL - see the sketch after this list. The DLL could in turn do REST calls to get more data from the same or a different service.
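
For completeness, such a WebBroker action handler is not much more than this minimal sketch (Web.HTTPApp; the action name and the JSON payload are made up) - the same web module compiles into an ISAPI DLL for IIS or an Apache module project:

procedure TWebModule1.WebModule1PersonsAction(Sender: TObject;
  Request: TWebRequest; Response: TWebResponse; var Handled: Boolean);
begin
  Response.ContentType := 'application/json';
  Response.Content     := '{"persons":[]}';   // example payload only
  Handled := True;
end;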

Perhaps you also want some URL rewriting or other server-side handling; in this case, you also need a Filter.dll.

This time I'll do it better...

With the newly installed Windows 2019 Server, I tried the simple Hello-World DLL that is included with the egg-loader. But the server was unable to execute this little DLL - a request to the URL just downloaded it. It was the same with other DLLs I simply copied from the old server. (The Hello-World.DLL works without problems on the old server.) After some googling, I found the "use 32-bit DLLs" setting. But with this setting, I got a 500 with a weird error message. Setting the error level to detailed brings more information. Spoiler: Do not google these errors! Everything you'll find is nonsense, like: you have to disable compression or other things. Most of the ideas lead you to the handler settings - also wrong. (Of course, you have to install all the ISAPI IIS extensions; I assume you have done this.) So what is the difference between the old and the new server?

64-Bit!

OMG... Why is there a setting for 32-bit DLLs if you have to use 64-bit anyway? OK - start 10.4.x, compile it to 64-bit and try again. Too bad the loader version 2.0 is from 2005 and not Unicode. After some fixes - the same problems. I had forgotten to disable the 32-bit setting. The Hello-World.DLL brings no errors anymore but is downloaded again with every call. The loader is "kind of" working. All log files contain wide strings. Unreadable. Again some Unicode fixes, and I recompiled the Hello-World.dll to 64-bit...

And here we go... a nice "Hello" displayed in the browser. But the egg-loader is unable to replace the *.run version with the *.update version. I think the IIS is still holding the DLL. (How is this possible?) So I wrote an email to William Egge asking if he had any ideas...

This error seemed impossible. The loader is working, the hello-world is working, FreeLibrary should also work, and the IIS should have no knowledge of the DLL loaded behind the loader. Why is the loader unable to rename the files?

OMG... Again... No rights! 

By giving the IIS user the access rights, everything is working perfectly! Of course, I wrote to William that everything is working now... Answer from Bill: "Ok Cool 😊"! Perhaps I will send him the changes, or I will write a different version that is also able to handle Filter.dll's. Not decided yet!

So here is the workflow:

1. Install the Windows Server and all necessary IIS extensions.
2. Create a website!
3. Compile your WebBroker.DLL (64-Bit or matching the Server)
4. Create a directory outside of your web-page-path. (e.g. c:\API / c:\ISAPI / c:\Script)
5. Create a minimal web.config in this directory. (for debugging errorMode="Detailed")

<?xml version="1.0" encoding="UTF-8"?>
<configuration>
    <system.webServer>
        <handlers accessPolicy="Execute, Script" />
        <directoryBrowse enabled="false" />
        <httpErrors errorMode="Detailed" />
    </system.webServer>
</configuration>

6. Upload the DLL to this directory and rename it to WebBroker.run (or any other name).
7. Upload the loader.dll to the directory and rename it to WebBroker.dll (Same name as your DLL)
8. Add the IIS-User with access rights to this directory
9. Add an Application to your website in the IIS-Manager. Point to the ISAPI directory.

(The name will be part of the URL: "API" -> https://www.yourdomain.com/API/WebBroker.DLL)

10. Read access is not permitted and browsing is off, but I would store log files outside of this directory anyway.

With the redesign of the egg-loader, we are back in the game!

If you want to see a YouTube tutorial of this, please leave a comment. There will be a video about this as part of my Firemonkey Development Kit and the Server-Side of my JSON-Store implementation, but this is scheduled not before I've completed my #D.MVVM Framework series and more basic stuff of my FDK.

If you want a notification on new videos, please subscribe to my channel and hit the bell icon! Thank you, this helps to grow my channel!

Tuesday, September 29, 2020

Workflow and multi - binding with #D.MVVM - The Delphi MVVM Framework!

Workflow and multi binding with #D.MVVM!

In every example for MVVM, you find the three blocks diagram explaining the binding between the View, the ViewModel, and the Model.

If you dig deeper into a real-world application, you may use the 1:1:1 relation, but in most cases, you need it very differently.

1:1:1 means One View connected to one ViewModel, connected to one Model!

Let's start with the View:

Most of my views are not simple TForms but are assembled from small subframes. The subframes are also TForms and not TFrames, because of the recurring problems with frames (in the past). But that is not the topic.

The View will be created in the composition root - or elsewhere - and then?

Every subframe must also be created and placed into the target frame of the parent view.

So far so good. Of course, this is all handled by my framework.

In some cases, every frame should have its own ViewModel. But what if not?

Well, keep this in mind and start at the other end of this chain...

We have a database where every row is just an entry of our beloved TPerson, with the normal address fields.

Part of this TPerson is a list of telephone numbers handled in a different table and perhaps also a list of bank accounts also in a different table.

As we all know - I hope - the Model is not our database interface; the Model gets an interface to the database for local or remote storage. This could be an interface to a local database, to a RESTful web service, or something else!

Let us talk concretely for the moment:

To store a value into a database or send it as JSON you need the field-value. For the complete row and perhaps for the subtables you need a list of field-values and the field-names or at least the index position in the param list of the query.

We will name this: Place X.

Place X needs the Name, Street, Zip, and more... And of course, the two lists for the subtables.

To be able to get this right you have to collect the data. A good place would be the Model. Wait... The Model?

So for all the different fields and from at least 3 frames, we have to collect the data in only one Model!

To store the data, the database should do a transaction to keep the subtables in sync with the address table. This would be a nightmare if 3 Models would do this separately.

At this point, we have our View with 3 subframes (Address, Telefonlist, Bankaccountlist) and only one Model.

1(3) : ? : 1

The View is not the place to collect the data from the subframes. Should we bind the subframes to one ViewModel?

In this case, we get our example relation of 1:1:1. That would be nice, but will it fit our needs? One reason to separate our View into SubViews is the possibility to swap out a subview to a different version. To check if the subframe data is sufficient to be stored, I would like to have a simple ViewModel per frame. Let's collect:

PersonView -> PersonViewModel -> PersonModel <- Database/Rest Interface

AddressView -> AddressViewModel -> ?
TelefonView -> TelefonViewModel -> ?
BankView -> BankViewModel -> ?

The PersonViewModel has no fields (or just two buttons).

Even if we do not store data in visual controls, we must define the controls. Let's name these Place B, C, D for the three subviews, and Place A perhaps for the PersonView that only has a clear and a save button.

In the PersonViewModel we have to declare a Field/Property canSave (Place A1). Inside the PersonViewModel we must be able to check the three other ViewModels, so every ViewModel of the subviews should be connected to the "parent" ViewModel.

In the AddressViewModel we have to declare the Field/Properties for the Address. (Place B1) And the same for the two other ViewModels (Place C1, Place D1).

In a best-case scenario, we would skip the Models for the subviews - if not we would also have Place B2, C2 and D2.

Is MVVM slower in development because you have to type everything three times?

Normally: Yes, it is!

Don't forget we're doing MVVM to get things developed faster!

How can we achieve this?

One major point for MVVM is Testability. Perhaps we are slower in the first place, but with a huge project and a lot of Models - if we have really good test coverage - we are much faster than other projects after a short period of time. Especially if you're doing TDD. 

If you haven't seen my first two videos on YouTube, please take a look. And also please subscribe to help me grow my channel!

My idea is auto-binding by name; this saves you a lot of time. The other idea is to store the field values as they flow from the component to the final database.

Let's take a look at the full development cycle and for simplicity name it in Delphi terms:

Create a TForm for the subframes, name it TPerson.View
Create a TForm for the address, name it TPersonAddress.View
Create a TForm for the telephone numbers, name it TPersonTelephone.View
Create a TForm for the bank accounts, name it TPersonBank.View

In the composition root: register the Views and implement the creation of the Forms.

Create the 4 ViewModels.

As the ViewModels are "just" classes derived from TViewModel, we only need to implement a property for each component on the TForms. This is the place where we would like to store our values. To speed things up, we don't create local fields as

FName : String;

we create this local field as

FName : TProperty<String>;

Auto-binding will do the rest and connect the Name : TEdit on the Form to this "property"!
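
As an illustration (the exact base class API is of course defined by #D.MVVM, so take the details with a grain of salt), such a ViewModel is little more than a list of these properties:

type
  TPersonAddressViewModel = class(TViewModel)
  private
    FName   : TProperty<String>;   // auto-bound by name to the TEdit "Name"
    FStreet : TProperty<String>;
    FZip    : TProperty<String>;
  public
    property Name   : TProperty<String> read FName;
    property Street : TProperty<String> read FStreet;
    property Zip    : TProperty<String> read FZip;
  end;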

Create an empty TPerson.Model as TPersonModel = Class(TModel).

At this point - if you're doing TDD - I would create a TPerson.Model.Test.pas and also a TPerson.ViewModel.Test.pas. But this is a topic for a different blog post.

The base TViewModel class has a procedure called ViewModelSet that is called when the ViewModel is assigned to the View.

This event is fired because you set the property in the composition root, like this:

Result := TPersonView.Create;
LViewModel := TPersonViewModel.Create;
LViewModel.Model := TPersonModel.Create(TPersonDB.Create('local'));
Result.ViewModel := LViewModel; // here

By setting the ViewModel property on the View, the framework does the automatic loading of the dependent subframes. So when everything is ready to go, the ViewModelSet method is called to set up the multi-binding.

So we have:
1(3) : 1(3) : 1 chain!

If our MainView / MainViewModel wants to create our PersonView, it just calls Show('PersonView') as part of the Navigation Service - bang - everything is created and everything is wired up for the View and the subviews.

Now - how to transfer the Data from the View to the ViewModel to the Model to the DB and even from SubViewModels? Sounds complicated!

Let's compare RAD vs. MVVM!

In a "normal" RAD app you will have to define some kind of data structure to keep your data in memory. This could be a class or a record with all your field values. In our case, there will also be some kind of TList<TBank> or TArray<TTelephone>. To get your data into your data structure you can use LiveBindings to a dataset, or in the old days perhaps a DBEdit to do it the direct way. Really horrible, because while viewing the data, the database connection must be kept open or the data copied to a mem-table. Perhaps you have done it the old way, by manually setting the values into the components and back, by calling methods like:

FormToData, DataToForm where you have to set every field to the component.

EditName.Text := Data.Name;
EditStreet.Text := Data.Street;

// ---

Data.Name := EditName.Text;
Data.Street := EditStreet.Text;

Even if you use LiveBindings, this can get messy if you have many controls on a form, and of course, it is horrible to maintain, because all the changes are in the DFM/FMX file, which is not so easy to merge in your repository.

Designing the Form is the same for RAD and MVVM.
Instead of creating a Class to hold your data, in MVVM, you create a ViewModel. (Perhaps some tutorial says that the data is stored in the Model, but this is not my approach). Creating the ViewModel should take the same time as creating your Memory-Class. The auto-bind will be faster than every other RAD approach.

1:0 for #D.MVVM.

Auto-loading of dependent frames would make it 2:0 for #D.MVVM, but you could use real frames, so this would not count. Still, setting up subframes in the composition root is easier to maintain, so let's call it even and not count this. So we are still at 1:0.

Using the ViewModel in a unit test to check whether the form works as expected: 2:0 for #D.MVVM. This should actually count as 3:0, because unit testing a RAD form is not really an option.

So for the design part, we are faster with #D.MVVM than with the pure RAD approach. 

And this is where the problems begin: We don't want to redefine all the fields in the Model and thus double our effort and lose our time advantage. We also want to save ourselves the typing work and don't want to duplicate source code which we would have to maintain in parallel. But in some cases, we have to convert fields to be able to write them correctly into the database. The same goes for the database itself. We don't want to create a separate definition for our database by copying the fields from the ViewModel.

Perhaps you use a tool to design your database or some kind of Modelmaker stuff to create your DB and Classes! From my point of view, this is not the way to go. I want to have everything in the source to keep track of it over the source repository.

How can we achieve this?

Well, one idea is to use an on-the-fly created data context that is able to get or set the property fields in the ViewModel and also convert the data fields while sending them from the ViewModel to the Model and after that to our database interface. If we finally have a wizard in place for creating ViewModels and Models inside the IDE, it will be a piece of cake to do this. Without a wizard, we perhaps have to rebuild the structure only one more time to create a universal data structure that is able to deal with our ViewModel, our Model, and our CRUD database.

Without this approach let's collect our places where we have to copy all the fields. We need a copy in the Model (A2, B2, C2, D2) and in our database (at least in the SQL statement to create the DB and in every query to read or fill the values) (A3, B3, C3, D3) and as we remember Place X for the JSON.

This sounds like overkill and casts a bad light on MVVM. Four places to copy all fields... Horrible.

That's why I do it in a different way with my #D.MVVM framework. Create your ViewModel, use the ViewModel definition plus some attributes to define an ORM-like model for the database and also for the Model, and define some converter methods to handle or change fields or field types. That's it!

No need to define any fields in the Model: 2:0 for #D.MVVM.
Just define converter if needed: 3:0 for #D.MVVM.
Create the database with just one definition: 4:0 for #D.MVVM.

And don't forget this is VCL and FMX. The next step is a View to ViewModel wizard to convert your legacy VCL app to VCL-MVVM or FMX. 

This could all be done with the pure #D.MVVM Framework. Combining this with my FDK - The Firemonkey Development Kit, that contains many units and functions that are also usable with VCL, provides the intuitive MVVM developer with a tool that can not only create even the most complicated application with numerous interfaces in a very short time but also keep it maintainable for many years.

So stay tuned for the next #D.MVVM video on my YouTube channel!


Tuesday, September 15, 2020

Submitting new Apps to the stores!

I do not often need to update my main App in the App Store or Play Store.

But every time I have to, it's a pain in the a**.

Don't get me wrong, uploading a new version of my App to Google is done in 2 minutes and the App is online nearly instantly.

My daily driver is an iPhone and I also have some iPads, but the store, with all the provisioning and certificates, is a pain.

But even if you've managed to get all your certificates right, there is nearly every time a new key or entry for the *.info.plist. Most of these are handled well by the IDE.

Then you upload your binary, and nearly every time, because of all the changes by Apple, you get a weird error. After spending hours googling, perhaps you find out the reason.

This time it was the multitasking orientations for the iPad.

All the keys like UISupportedInterfaceOrientations~ipad with:
  • UIInterfaceOrientationPortrait 
  • UIInterfaceOrientationLandscapeLeft
  • UIInterfaceOrientationLandscapeRight

The error complained about these keys. Adding

<key>UIRequiresFullScreen</key>
<true/>

fixed the problem.

Then I saw that I had not included the 1024x1024 px icon (because this requirement is new).
After selecting the right icon, you have to increase the version number.

Submit and wait for the review. 

My App has been in the Store since 2013 - this time they wanted a video from a physical device, showing the use of the location sensor. (OK, this was an old function that is no longer in use, so I disabled it.)

And I've tested my App against iOS 14, to be sure that everything is still working...

App rejected because I mentioned iOS 14 in the what's new part.

So again a new upload and a new version number! I really like my iPhone, but from the point of view of a developer - Android is so much easier to handle.

The new version is waiting for the review, we will see...

Monday, August 17, 2020

#D.MVVM - At what point is a framework ready for release?

Besides some bug-fixes and the typical daily stuff, my focus is still on my Delphi-MVVM-Framework. 
#D.MVVM

Why? Because I want to port all my old applications to it, and I promised myself not to start a new project until the framework is ready, so that I don't get the idea of either using an older version of it or not using MVVM at all.

There is also a dream that I'm able to port our 35-year-old main VCL application from D2007 directly to 10.x and #D.MVVM - at best directly to FMX.

So, I have to make sure that I can call the framework finished, which means I also have to set the needed functions to the status "finished".

Of course, we all know: With such a project, as with any other application, there is never the status "finished". So how do I find out when I'm just "tweaking" it? OK, first of all, I need a feature list. I would really like to have a unit test for each feature. Second, I need sufficient documentation. (Oh boy, this is a breaking rule. I have no documentation at all - so far.)

But what about clean code - or at least cleaning up the code (a little bit)? Yes, this is 100% necessary.

How do you find out if an entry in the feature list is a really necessary feature and not just some nice functionality I want to add? I would really like to read your comments about this. What are the core functions you think must be included in a 1.0 version of my framework?

Before we try to collect some features we should define some rules.

To implement an application with the MVVM-Pattern we want:

  1. No, or nearly no, code in our View! Except for UI stuff!
  2. Compared with a RAD app, an MVVM app should be just as easy to create.
  3. The bindings are the biggest problem, they must be as easy as possible to set up.
  4. No use of anti-patterns. (Some people already call a ViewFactory an anti-pattern.)
  5. At best: just create both kinds of "Forms" and you can compile to VCL or FMX from the same codebase.
  6. Every part of your application should be unit-testable, so no dependencies between your modules. 
  7. Don't break the layer concept.
  8. The framework should work without the need of the FDK. (I had to duplicate some stuff)
  9. D.MVVM integrates as a plugin to the FDK to benefit from all the higher-level functions.
  10. Don't use the visual live bindings.
  11. Everything should work, without clicking non-visual components or special components on a form.
  12. Naming conventions over configuration.
  13. Don't care if a C# developer thinks you are doing it wrong! 😄

Anything else?

If you know me, you know I really dislike clicking things "together". I like to do everything in code - except the form itself, of course - so that every change is visible in the source code repository. Perhaps I will create some components in the end, to fit the needs of RAD development, but that is not decided yet.

While creating a feature list, I can also create some tickets to keep track of the development process.

My D.MVVM Framework should contain:

  1. A Wizard to create a new D.MVVM Application.
  2. A Wizard to create a new View (Form or Frame).
  3. A Wizard to create a new ViewModel.
  4. A Wizard to create a new Model.
  5. A Framework (VCL & FMX) independent overload of the TForm implementation to give every Form some base functionality. (done)
  6. A central point to do all the joins or Viewchains. (done)
  7. All included VCL and FMX components should be bindable. 
  8. 3rd party components should be easy to include in the binding schema.
  9. Bindings from the View to the ViewModel should be possible without writing any line of code. (done)
  10. Bindings from the ViewModel to the Model should also be possible without any code. A value converter should be included in the data transfer event. (done)
  11. A data context should keep the data in memory to be displayed. (done, but not fully tested)
  12. Some basic services should be included. (Action-Service, Navigation-Service, Menu-Service, Dialog-Service, Time-Service) 90% done.
  13. A List-Adapter for handling huge Data in Grids, Listboxes, and Memos. (done)
  14. Public Unit-Test with 100% code coverage. (Some are done, but I have not measured the coverage)
  15. VCL and FMX demo Apps. (some are done)
  16. Documentation.
I think this is not the final version 1.0 list yet, but almost.

Again: After watching Uncle Bob's clean code lessons - I have to do some heavy refactoring...

If you have any suggestions for me, I would be very happy about your comment on this topic!

If you're reading my blog for the first time, please take a look at my Youtube Channel for my already released D.MVVM-Videos.

Please don't forget to subscribe and hit the like button to help me grow my Channel.



"New" Video about SQLite in Threads is online!

Ok, of course, this Video is not really new...


I just could not find the time to do the translation and record all the translated voice-over.

This is the second video of my German CodeRage 2020 session.

In German, the same sentence sometimes takes twice as long to say as in English. That's why the videos have some "silent" parts.

This is the second time I've tried to use the Speechelo AI engine to create the English voice-over! It's not so bad, and since I haven't gotten any bad feedback on the videos produced with this engine, it looks like you agree.

But I think next time I will do it by myself again.

Producing an English video and translating it to German afterwards is much easier than the other way around.

So have fun.


Tuesday, August 4, 2020

Unicode migration is harder than everybody told you - if you are old.

Harder? Why harder? Everybody is telling you it's easier than you think.

With every new Delphi version there is this kind of movement:

Do your Unicode conversion today and use our new RAD Studio version to do your work even better than before. (Often combined with a special offer.)

I fully understand this and, to be honest, this is true and it's also a good idea. Using old Delphi-Versions is horrible. Yes, the old versions are a little bit faster and perhaps the IDE is a little bit more stable. But this is a bad tradeoff because you miss all the new stuff that makes your life so much easier.

Ok, if everything is so fine and easy, why this blog post?

If you have old source code in your legacy application and the oldest unit was created after the year 20xx, you probably have no problems, because you've used the RAD approach and have clicked DB components onto your form, directly bound to your database. Perhaps you used the BDE, but converting this to FireDAC is not so hard.

So far so good... Buy the latest Delphi Version to go ahead...

But if you're old and you really learned how to code in the old days with Turbo-Pascal, you are not using DB-Components at all. 

You are using a Record to hold your data. The records had to be aligned by byte or had to be marked as Packed Record.

You store a date in 3 bytes and, of course, you had to use typed strings to match your needs.

In those days you had this kind of declaration, because every byte counted:

Type
   Str6  = String[ 6];
   Str26 = String[26];
   Str40 = String[40];
   Str80 = String[80];

   TAddress = Packed Record
     FirstName : Str26;
     LastName  : Str26;
     Street    : Str80;
     Zip       : Str6;
     Town      : Str80;
   end;

var
  Address : TAddress;

With this definition, you just wrote an address to disk by using BlockWrite.
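
For the youngsters, a minimal sketch of that old-school way (an untyped file with the record size as block size - one record, one call):

var
  F       : file;        // untyped file
  Address : TAddress;
begin
  AssignFile(F, 'ADDRESS.DAT');
  Rewrite(F, SizeOf(TAddress));   // block size = record size
  BlockWrite(F, Address, 1);      // works only because the packed record
                                  // contains no references
  CloseFile(F);
end;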

This kind of code works with TP 1 up to Sydney. So what is the problem?

If you change these (short) strings to a Unicode string or even to an AnsiString, you can no longer write them to disk this way, because a long string is only a reference, not the actual data. No problem, you just have to serialize the data for reading from and writing to a stream. Each dataset in this stream then has a different length and you have to create a jump table to find the starting position of a record. This is the best point to stop storing your data in a flat file.

Let's assume you have managed this in your whole app... I still haven't.

Long before I had converted my source code to be able to compile with XE, I thought it was a good idea to convert every String to an AnsiString, every Char to an AnsiChar, and every PChar to a PAnsiChar.

(At this point I was not aware that the VCL does not work like the Windows API, where nearly every function has an A and a W version for Ansi and wide strings.)

So each

Label1.Caption := Address.FirstName;

produces a warning while compiling. The compiler does the trick here and converts the short string to the Unicode caption string - so it works. Btw. this is called a possible solution in most white papers. It is not - yes, it compiles and yes, it runs, but the warnings are the problem, as we will see later.

Again where is the problem? I call it the "Var Param Problem". 

Imagine you have the record from above. Then you probably have some methods to deal with this data structure, like:

Function MakeName(var Firstname, LastName : Str26) : Str80;

Using var was a good idea, to avoid copying 54 bytes onto the stack.

Perhaps the MakeName function calls another method inside:

Function Crunch(var S : Str26) : Str26;
begin
  While (S[0] > #0) and (S[Byte(S[0])] = ' ') do
    S[0] := Chr(Byte(S[0]) - 1);

  Crunch := S;
end;

Function EmptyName(var S : Str26) : boolean;
begin
  Emptyname := (Crunch(s) = '');  
end;

Function MakeName(var Firstname, LastName : Str26) : Str80;
begin
  if EmptyName(Firstname) 
    then MakeName := crunch(LastName)
    else MakeName := crunch(LastName) +', '+crunch(Firstname);
end;

Yes, if you did not know this - before RESULT was the way to go, assigning the value to the function name was the way you had to write functions!

There were more stupid things we did in the old days.

Fillchar(Address,sizeof(Address),#0); 

Kills the reference and produces a memory leak if you have a long string in the record.

Move(Address, OldAddress, sizeof(Address));

Moves the reference and doesn't create a copy of it.

With hundreds of methods using short or special short string types as var params all over your source code, there is no easy way to convert all of this step by step to a different string type. Once you have changed one function that is called by nearly everybody, you get this snowball effect of compiler errors.

Yes, most of the conversion besides the "Var Param Problem" is done by the compiler. The problem is: you will get thousands of warnings and you cannot ignore them, because some of them you have to deal with.

The hard task is: find the 500 or more real problems among 60,000 warnings. I've started with XE2, and by ignoring most of the "potential data loss" warnings you get when you assign a string to a short string, I'm down to ~7500 warnings.

But here is some good news. With 10.4 we get new managed records that can have their own assign and copy operators. This is the first time there is a possibility to handle long strings in records the right way, by creating a new instance on copy.

It would be so perfect if we had a compiler switch to turn off Unicode in XEx - sorry, just dreaming.

While refactoring the code to XE, everything has of course to match binary 1:1, so every not-yet-converted data structure written to disk stays the same data. By using any source code repository you are able to maintain both trees (Unicode and non-Unicode) for a short time, until the conflicts become too much. So the only way to go is to be able to make changes to your non-Unicode source that produce the same results with fewer warnings when you take the XEx compiler. Going this way, you want as few IFDEFs as possible in your source, because you already have a big pile of code to refactor, and in the long term IFDEFs make it more unreadable.

So, how to find a way to go?

It would be perfect if I could keep the short strings in my records as long as possible. Outside of these records, everything could be "normal" string.

- impossible

Everything is calling everything else and nearly every unit is linked to every other unit - not always directly, but via some other unit in most cases.

Side note: Don't watch Uncle Bob's Clean Code stuff, because after that you will hate your work of the past 35 years even more than you already do. (We were young, we needed the money.)

No, this is wrong: EVERY developer should watch Sessions 1-6 from Uncle Bob.

Stop reading here and click on this link. If, after that, you still think you do not need to write unit tests - sorry, in this case, I can't help you at all. (The only possibility is: you have skipped some parts of the videos.)

If I had trustworthy unit test coverage already in place in my legacy app, EVERYTHING would be so much easier and there would be no fear of changing some of the old methods and dependencies.

And now? I've started a different approach - using my source code tokenizer from my source code formatter project, I was able to find all dependencies in all my units' uses clauses and removed ~400 unit references that are not needed anymore in "this" unit. Not bad, but not enough.

I plan to restructure my dependencies with brute force or a neural network - we will see if this can work. The next thing would be a "path through the source" finder, to convert all var params from short string to string where possible.

So there is a long way to go, and in the end we are on VCL 32-bit. 64-bit would be the next step. If everybody is going the ARM way, perhaps we have to convert to FMX sometime in the future.

To get something done, I'm going two ways: first, decoupling and writing unit tests; and second, trying to get rid of the short strings, or at least of the non-critical warnings.

And of course everything just in my spare time. I wish I had...

If you have any good ideas - I like to read them in the comments.

So long... Happy coding and watch Uncle Bob!

Friday, July 24, 2020

My Fight for FMX!

If you know me or if you read my blog, you probably know, that I stand 100% behind FMX.
Well, I've been "into FMX" since XE2, but let's say with XE6, or better XE8, it became really usable for every kind of development, desktop and mobile.

For many years I felt a little bit lonely in this field of development, but since XE10 the club of FMXers has been growing faster and faster.

Finally, one of the VCL component developers is taking the FMX road, too!

So please clap your hands and welcome DevExpress to the club.

Fun fact: The blog post about the new DevExpress FMX Grid CTP was posted by the VCL team - still with the VCL logo.

They also promise to include "every single VCL product" in their FMX offering... I think this is their way of saying: "We will provide an FMX version of all our components".

So to all the haters who for years haven't taken FMX - and/or my love for it - seriously, I would like to answer with a quote from the DevExpress blog post:

"If everyone moves to FireMonkey, we’ll be sure to follow."

Have a nice weekend!

PS.: If you want to start with FMX, don't forget to buy my Firemonkey Development Kit.

Saturday, June 13, 2020

CodeRage 2020 - Quickinfo...

Oh boy... Why has it always been so hard to do the easy stuff?

My idea was: just make two sessions, do some subtle advertising on the side, and I'm done...

But for session one, I had to explain so many things and create "some" screen-shots. 
Normally a 45 min. video takes ~12h to create. Sometimes a little more, if I have to develop the examples first.

I'm getting faster on "the creating video task", but this time...

The first 10 minutes cost me 4 days. Perhaps I'm doing something wrong.

And session 2?

My estimate was "double the time" of session one... and I haven't even started creating it.

It's time to speed up.

I'm also getting faster at the "two audio tracks" thing. Synchronizing the English and German audio is not a big deal anymore. (OK, it still costs 2.5x the time compared to a single audio track, but it's faster than doing the distracting subtitles.)

I hope you will like the two sessions and that the work was not in vain, or different from what you expected from the title.

So don't miss it on the 2nd of July! (The English versions of my videos go online while the German versions are being streamed.)

PS.: Of course, if you're a user of my FDK, all the stuff shown in the video may not interest you, because many more sophisticated routines are included in my framework. Perhaps take a look anyway, perhaps just for fun.



Monday, May 25, 2020

German CodeRage 2020

The German CodeRage will be live on the 2nd of July. I will present two sessions.

  1. 09:00 UTC Threads and Queues (11:00 MEST)
    How can I accelerate my application by using Queues and Threads to execute some workloads in the background? My session will show some examples of how to include this kind of asynchronous data processing in your app.
  2. 16:00 UTC SQLite in Threads (18:00 MEST)
    How can I use threads to access an SQLite database? Spoiler: you can use the techniques from session one.
If you want to see my session live at the CodeRage event, you should register and you can also be part of the Q&A session.

If you prefer to watch the session in the English language, it will be online on my Youtube Channel at the same time!

So stay tuned and please subscribe to my channel - perhaps you will find a teaser ;-)!





Monday, May 11, 2020

Delphi 10.4 Sydney - #Delphi104 - #ComingSoon

Hello, my friends!

Yes, I've got permission to blog about the upcoming new version, Delphi 10.4 Sydney!

First I want to answer the most urgent question for all FMX-Mobile developers:

What about ARC?

It's gone, it's history, I hope I will never see any kind of ARC again!

Next on my list is Metal... No more OpenGL-(ES) on iOS.

It has a nice new feature: you can set your framerate, e.g. fixed at 60 FPS, or only refresh the screen when something has changed. I haven't done any long-term tests with this setting, but I assume that this will extend your battery life.

With both - no ARC and Metal - my iOS app is flying! Not only by the numbers, you really feel the new performance.

The other thing: The new Managed Records - as Marco Cantu has already blogged about it. Please follow the link!

Why are these records so interesting? Well, I have a huge, 34-year-old app grown from TP to Delphi. In this app, we mostly use records for everything. At the moment we can only use short strings.

Why?

Imagine a record with an ansi- or unicode-string like:

TFoo = Record
  Name : Ansistring;
end;

Every procedure that wants to use a TFoo instance does something like:

Procedure Bar(Var aFoo : TFoo);
var
  Buff : TFoo;
begin
  Buff := aFoo;
  aFoo.Name := 'Othername';
  //...
  aFoo := Buff;
end;

Because you only copy the reference of Name, changing aFoo.Name also changes Buff.Name. So the Buffer is not working at all!

With the new records you have a copy (Assign) operator, in which you can explicitly call SetLength to create a real copy of the string.
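
My reading of the feature, as a sketch (not production code): the record defines an Assign operator, and SetLength forces the copied string to become unique.

type
  TFoo = record
    Name : AnsiString;
    class operator Initialize(out Dest : TFoo);
    class operator Assign(var Dest : TFoo; const [ref] Src : TFoo);
  end;

class operator TFoo.Initialize(out Dest : TFoo);
begin
  Dest.Name := '';
end;

class operator TFoo.Assign(var Dest : TFoo; const [ref] Src : TFoo);
begin
  Dest.Name := Src.Name;
  SetLength(Dest.Name, Length(Dest.Name));  // force a real copy of the string
end;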

For the first time in history, the migration - from D2007 to 10.4 - will save a lot of work. We will see - still a long way to go.

BTW: Now it's a good time to renew your subscription. Please click on the banner below.

Happy coding with this #ComingSoon new Version of  #Delphi104!






Saturday, May 9, 2020

The database on network file problem!

Perhaps you're lucky and you're using a database server for your application. So any user is able to have full access to the server anytime. Or your application will not run on a network and you can easily use a simple file-based database like SQLite.

If not, welcome to the club of developers using no database at all, or some hacky tricks around the shared-access problem.

There are some implementations out there in the wild with different approaches to overcome the problem.

You may ask: Why don't you just install a local database server?

Of course I could install a Firebird server or the free MS SQL Express server, but in these cases that PC must always be switched on, so that it is accessible in the network to play the server role. As always, the easy solution is not possible.

Many of my clients have a really simple network using the "Fritzbox" as the router and up to three PCs connected to it. To be able to run our software on any of these PCs without having to switch on a "server PC" each time, there is a NAS connected to the Fritzbox with all the data.

Shared access is handled by the file system. Yes, this is a working solution, no question! But as every application and also the stored data is always growing, there is a point where a real database would solve many problems.

Googling for SQLite and network, you will find some implementations: uSQLiteServer, SQL4Sockets, the SQLite ODBC Driver, or SQLiteDBMS. And you also find an easy protocol for handling these calls (TechFell).

Without going too deep into the research: everybody is using some kind of TCP/IP / socket handler to restrict the access to the "database".

So why restrict yourself to some cheap interface that can only handle the easy stuff?

Let's collect our needs: We want:
  • locking
  • an easy to use interface
  • threadsafe would be perfect
  • perhaps asynchrony access
  • some kind of remote procedure calls
  • perhaps some kind of caching? 

This should all be possible to write in a reasonable amount of time. The caching could be a challenge, but I would love to see a 64-bit implementation of this that is usable from a 32-bit application, so the always-empty 14 GB of spare memory could finally be filled with something useful.

I think I will start with a nice slim socket implementation, using UDP broadcast to find other clients or "servers". Then connect over TCP and implement a simple low-level protocol for the handshake, ping, and version checking. Perhaps a plugin system that is able to auto-reload new versions from a server.

Yes this is all doable...

Why reinvent the wheel? My answer is as always: Because my wheels are running better. Or at least I think so... 

One big problem is still in my way: Find the time for doing this.

Perhaps you would like to see me live on YouTube trying this? Or my breakdown, because I have underestimated the problem... In any case, please leave a comment and subscribe to my channel.

I have to travel to a planet with a lower rotation speed!


Tuesday, May 5, 2020

Is a database just a data storage?

In the old days - and remember the main title of my blog: "from old school..." - data was saved in files. For the new kids on the block: a file is storage on your hard disk; a hard disk was a device with spinning disks inside. A read/write head could store and read bytes to and from it.

So in those days we just wrote a block of bytes to these files; we used the "BlockWrite" command to do this! Why? Because it is and was the fastest way to write a record to disk in binary form.

And NO, streams are not faster, because down in the RTL a TFileStream uses the same functions, but needs more calls to get there. Maybe you call "BlockRead" and "BlockWrite" old style, but I don't care.

Before we got hard disks, we had floppies. You got the best performance out of a floppy if you could provide a buffer to read the whole track in one rotation. If your CPU or your floppy controller was not fast enough, the sectors on the track had to be interleaved, and in this case you needed more than one rotation - too bad.

What was the title of this post?

Oh yes. We stored data, mainly records, in binary files. Sometimes we had an index. The index was a string and a seek-number. We loaded the index file, found the matching string, and used the seek-number to find the record in the binary file. If we had this index, we called it a database.

What about the performance? Besides the indexing algorithm, a database also has to load the data from disk and uses the same OS functions to do this. I assume a "normal" database that uses a file to store the data needs more than one block read - and on the client side? For a dataset with 100 fields you have to write 100 times: FieldValue := Query.FieldByName('FieldName').AsString. This is so awfully slow... With one "BlockRead" I get 10 KB into a record with 1000 fields in the blink of a nanosecond (or less). Just one call!

Perhaps knowing all this I use a database nearly the same way as in the old days. The CRUD-Way. Just do Create, Read, Update, and Delete!

That's why I could migrate all my applications to a REST-Server in minutes.

Yes, I've used "Join" once or twice, and also a trigger or stored procedure, but just because somebody told me: "Let the database server do this, the database server can do it better." In some cases this is absolutely right, especially if you're dealing with really big datasets and/or your database is on a remote computer. That's for sure! Sending an update to a table with constraints is much easier than doing this with "BlockWrite", no question!

Having session-based I/O while updating the customer, the invoice, and the stock table in one call - and if anything goes wrong, just rollback instead of commit - oh man, that helps a lot.

In a few cases, I only read some fields of a row, but most of the time I need all fields. So "Select * ..." is the call. After I get all the data, I need the aforementioned field-by-field assignment.

That's why I've programmed my JSONStore client-server database handler in my Firemonkey Development Kit. I know the name is bad - you can also use this unit in your VCL application!
Just select which fields you need direct database access to; all other fields are stored in a blob field. Of course, I compress the JSON before storing it. After loading the dataset from the server (over REST) or from a local database, you read your database fields the normal way and after that just let the RTTI do its JSONToObject thing. Done...
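
The reading side then boils down to something like this rough sketch (TPerson, the field name, and the decompression step are placeholders; TJson.JsonToObject lives in REST.Json):

var
  lJson   : string;
  lPerson : TPerson;
begin
  lJson := Query.FieldByName('Data').AsString;     // the blob field
  // ... decompress lJson here ...
  lPerson := TJson.JsonToObject<TPerson>(lJson);   // RTTI does the rest
  try
    // work with lPerson
  finally
    lPerson.Free;
  end;
end;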

Hey, compare this to the old style! A database with some keys and a blob field that reads and writes all the data in just one call into our record/class. We are back in the '80s - well done!

One thing is different: these days we have 5 GHz, a 64-bit CPU, and most of the time 8 cores or more - and not 3 MHz, an 8-bit CPU, one core, and only 64 KB (not MB, not GB) of RAM.

But we are lucky, because with all that memory, cores, and CPU clock speed we can read our data from the database at the same time/speed as in the '80s...

I love DB's...






Thursday, April 23, 2020

Live event : The Apocalypse Coding Group.

Don't miss Parts 7 & 8 of the Apocalypse Coding Group on 25 & 26 April at 14:00 UTC.

With:
Andrea Magni
Craig Chapman
Glenn Dufke
Ian Barker
Jim McKeeth
and me...

Where MVPs are trying to convince the viewers that they are worthy of this title, although it did not look like that in Live-Streams 1-6, with 4 hours each.
It's fun and you can annoy us in the live chat!

Part-7 : https://youtu.be/eJL_kp92N1Q
Part-8 : https://youtu.be/oh48IoNi9OI

The live stream is on Craig Chapman's channel! Please don't forget to subscribe to his and my channel so you don't miss the upcoming events we are currently planning together!

Legacy Applications

OK - here is the problem:

An old application, started with TP 3, grown to many millions of LOC.
Full of Moves, Records, and other stuff that is not ready for the move from pre-Unicode to Unicode.

That means no real RTTI, Generics, FireDAC, native HTTP, ITask, and other stuff you love if you are using 10.x.

With a look at the roadmap ;-) the new records would help, but anyway it's a huge task to get it running with XE.

The first idea was to use a DLL - of course - this works, but not really, because there is no really working shared memory, and an FMX DLL has problems of its own.

So you have to serialize everything over to and back from the DLL. And if you have to serialize everything anyway, you could just as well do this over TCP.

The multi-user network sharing is working but could use some improvements.

A local database would also be a good idea, replacing the old Enz-ISAM that I ported to Windows a long time ago.

Installing a real DB-Server is not possible. Any options?

I could install a Service on each workstation in the network.

What can a Service do for you?
First of all - no problems with admin rights anymore. Installation of anything else is a piece of cake.
The service apps on each workstation could talk to each other over a TCP connection. So without a dedicated server, the service apps could name one of them "The Server"; if that workstation is going to shut down, another workstation could be named "The Server" from then on.
Every running app could ask the local service on 127.0.0.1 - "Hey, give me the IP of the Server". No need for configs. Instead of 4 GB, the server could use all the memory and load DLLs for different tasks.

And the client-side?
The client can use a simple interface to the service, like OpenDB, LockTable, WriteData, UnlockTable, and CloseDB (CRUD with locking) - every command must (again) be serialized over TCP. The server could maintain the locking for each table. Voilà - a dedicated SQLite server, or any other DB. (And of course much more.)
Internet-Updates, DBCache, and any other service that is much easier to write in XE than in D2007.

This is the Idea...

Do you want to see me struggling to implement this? Perhaps on a live Youtube-Session?
Or are you more into FMX and MVVM?

Anyway - please subscribe to my YouTube channel and leave a comment on what you want to see next!

Have a nice separation...