  1. #1
    Join Date
    Aug 2010

    Jumping back into this again

    Hi all,

    I decided to jump back into this project last week to make some changes, and I was wondering if anyone is interested.

    Here is what I am currently working on.

    1. Move to supporting Task, Task<T>, and the async/await pattern instead of Deferred. Unity supports this now. I left Deferred in and created extensions to wrap either inside the other. COMPLETE
    2. Wrapped Unity's coroutines so that they can be awaited as Tasks. COMPLETE
    3. Re-organizing the project as it got a little bit jumbled towards the end. ALMOST COMPLETE
    4. Getting rid of any Photon references in the framework and game level. Everything will be interface driven with only the top layer injecting the Transport it needs. INVESTIGATING
    5. Getting rid of Photon's IFiber for concurrent action execution. Created a custom IConcurrencyContext interface and a concrete ConcurrencyThread class to take its place. COMPLETE
    6. Changing the codegen system so that it is dynamic and built in, with hidden generated files. INVESTIGATING
    7. Moving everything to .Net Standard so that it can run on anything (transport dependent). TESTED ON A COPIED PROJECT WITH SUCCESS
    8. Get rid of all checked-in binaries and move to nuget-based packages (planning on making a set of Photon nuget packages for easy hosting and deployment). IN PROGRESS
    9. Add a scalable NoSQL-style dataset / actor framework for non-relational data. INVESTIGATING
    10. Website will be rebuilt with ASP.Net MVC Core and a lot of design changes.
    11. Look into Unity's new Entity Component System that will replace MonoBehaviours in the next year or two.
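
    Item 2 above can be sketched with a TaskCompletionSource. This is only my guess at the shape of the extension, not the project's actual API; the `host` MonoBehaviour parameter is an assumption:

```csharp
// Sketch: awaiting a Unity coroutine as a Task (assumed design, not the real code).
using System.Collections;
using System.Threading.Tasks;
using UnityEngine;

public static class CoroutineExtensions
{
    // Starts the coroutine on a host MonoBehaviour and completes the
    // returned Task when the coroutine finishes.
    public static Task AsTask(this IEnumerator coroutine, MonoBehaviour host)
    {
        var tcs = new TaskCompletionSource<object>();
        host.StartCoroutine(Wrap(coroutine, tcs));
        return tcs.Task;
    }

    private static IEnumerator Wrap(IEnumerator inner, TaskCompletionSource<object> tcs)
    {
        yield return inner;  // run the wrapped coroutine to completion
        tcs.SetResult(null); // then complete the awaitable Task
    }
}

// Usage from inside a MonoBehaviour:
//   await MyCoroutine().AsTask(this);
```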

    If anyone is interested I could potentially share the code. I don't know where I would host it, as I use Visual Studio Team Services for source code / project management and I host everything on Azure.

    I am going out of town for 2 weeks, but I am going to plan this out and try to execute on it when I return. Just wanted to put out a feeler for whether anyone is interested.
    Last edited by am385; 08-06-2018 at 12:32 PM. Reason: Typo

  2. #2
    Join Date
    Feb 2014
    So glad someone is doing stuff. Although I keep coming here in hope, I know in my heart that this site is not going to feed my hunger for c# any more.

    I kinda like what I think your option 6 is doing. I am not a big fan of the generated files which make re-compiling a real annoyance, so are you trying to get rid of that one? Every other way I've seen it done is to hardcode a bunch of codes in enums. What's your idea here? (If you answer this question, PLEASE dumb it down for your target audience!)

    Am I right in thinking that a lot of Microsoft's direction is to steer developers away from feature-rich things towards common multi-platform libraries? .NET seems to be developing in both directions. Just what sort of clients do you see becoming available by going down the restricted feature-set path? Support for iPad-like interfaces is not to my personal taste, which is where it looks like Windows 10 is going.

    I've been a bit lost and de-motivated for the past 6 months. My sensible side is telling me to get back into coding again. My entertainment side tells me to keep playing Lord Of The Rings Online!
    Last edited by oldngrey; 07-19-2018 at 12:53 AM.

  3. #3
    Join Date
    Aug 2010
    Back from being out of town and going to start thinking more about this all.

    I'll try to answer your questions though.

    Codegen will be similar to what it is now, but instead of a separate exe that we build creating a file in the project, I am planning to create an MSBuild target that invokes the codegen on the .net environment already running during the build. It will no longer take a dependency on an exe that we build. It will leverage a dll that we build, but it should not cause this strange locking issue. The codegen file will be injected before build but will not be part of the project itself. I am also hoping for a naming system that holds up better than randomly generating the class names every time: ideally something deterministic, such as hash-based names that only change when the contents change. Still thinking this through, but I am researching another project that does this gracefully.
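
    Roughly what I mean, as an MSBuild sketch. The task and file names here (GenerateRpcProxies, RpcCodeGen.dll, RpcProxies.g.cs) are illustrative, not the real ones:

```xml
<!-- Hypothetical build-time codegen target; all names are placeholders. -->
<Project>
  <!-- The codegen lives in a dll we build, loaded by MSBuild itself,
       instead of a separate exe that can lock the build system. -->
  <UsingTask TaskName="GenerateRpcProxies"
             AssemblyFile="$(MSBuildThisFileDirectory)RpcCodeGen.dll" />

  <Target Name="RpcCodeGen" BeforeTargets="CoreCompile">
    <!-- Generate into the intermediate (obj) directory, not the project. -->
    <GenerateRpcProxies OutputFile="$(IntermediateOutputPath)RpcProxies.g.cs" />
    <ItemGroup>
      <!-- Compiled into the assembly, but hidden from the project tree. -->
      <Compile Include="$(IntermediateOutputPath)RpcProxies.g.cs" />
    </ItemGroup>
  </Target>
</Project>
```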

    We don't want unnecessary conditional logic in our message handling. With the system we currently have, which is reflection-based codegen, we do some logic on load and never have to walk a conditional tree at runtime to determine exactly what we need to do (granted, our mapped-method system does have some logic). This also makes it much simpler and cleaner to write.
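
    The load-time idea looks something like this. The attribute name and handler signature are assumptions for illustration, not the framework's real types:

```csharp
// Sketch: reflect once at startup, dispatch with a single table lookup at
// runtime, no if/switch chain. Names here are hypothetical.
using System;
using System.Collections.Generic;
using System.Reflection;

[AttributeUsage(AttributeTargets.Method)]
public sealed class OperationAttribute : Attribute
{
    public byte Code { get; }
    public OperationAttribute(byte code) => Code = code;
}

public sealed class Dispatcher
{
    private readonly Dictionary<byte, Action<byte[]>> _handlers =
        new Dictionary<byte, Action<byte[]>>();

    // Done once on load: every [Operation] method becomes a table entry.
    public Dispatcher(object target)
    {
        var flags = BindingFlags.Instance | BindingFlags.Public | BindingFlags.NonPublic;
        foreach (var method in target.GetType().GetMethods(flags))
        {
            var op = method.GetCustomAttribute<OperationAttribute>();
            if (op != null)
                _handlers[op.Code] = (Action<byte[]>)method.CreateDelegate(
                    typeof(Action<byte[]>), target);
        }
    }

    // At runtime: one dictionary lookup per message.
    public void Dispatch(byte code, byte[] payload) => _handlers[code](payload);
}
```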

    We also need to break the codegen into pieces that make sense. For example, server-to-server code should not include server-to-client stuff, and more importantly, server-to-client code should expose nothing more than needed, as extra exposure can open up security / hacking issues.

    In essence it will do exactly the same thing while giving us a hidden output file and not lock the build system.

    Microsoft's reasoning is not about pushing developers one way or the other, but rather about giving them options to go outside the platform they currently develop for if they want. .Net Framework, .Net Core, and the .Net Standard definition are all being developed in parallel. .Net Framework will always have Windows-only features available to it, but we don't really use those currently.

    So .Net Standard is literally just a standard, not a runtime. It defines a set of APIs that must be present to be compliant with the version of the standard you are targeting. This means we can target both .Net Framework and .Net Core with the same code base.
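
    In project-file terms, that split looks something like this (a minimal sketch, not our actual project files):

```xml
<!-- A shared library targets the standard and runs on either runtime: -->
<Project Sdk="Microsoft.NET.Sdk">
  <PropertyGroup>
    <TargetFramework>netstandard2.0</TargetFramework>
  </PropertyGroup>
</Project>

<!-- While an entry point picks its runtime (or builds for both): -->
<Project Sdk="Microsoft.NET.Sdk">
  <PropertyGroup>
    <OutputType>Exe</OutputType>
    <TargetFrameworks>netcoreapp2.1;net472</TargetFrameworks>
  </PropertyGroup>
</Project>
```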

    .Net Framework is robust, but it also has some downsides: some behaviors have performance consequences that cannot be changed, because changing them would break anyone who develops around those behaviors and expects a particular outcome. Remember that a machine has only one version of the .Net Framework installed, and it is backwards compatible with previous versions. If a .Net Framework 2.0 binary is executed, it still runs on the current version of .Net Framework installed on the machine (.Net Framework 4.7.3 or something like that right now). This means they can fix a bug, but not change the outcome or behavior of an API. Yes, it is more feature rich currently, but that is also changing, as .Net Core is growing quickly, which is pushing the growth of the .Net Standard as well.

    .Net Core is more performant than .Net Framework. Some of the design flaws mentioned above can be removed as you are able to package the full .Net Core version that you want to run with your project. It is also cross platform.

    Currently this is purely future focused: Photon is our current transport and it runs on .Net Framework, while Unity is moving towards .Net Standard since they are cross platform and Mono implements the .Net Standard as well.

    In my case everything will be .Net Standard 2.0 with the exception of the entry points, since both .Net Core 2.1 and .Net Framework 4.7.2 implement the .Net Standard 2.0 APIs.

    As an example, the servers will all be basically the same, but for Photon there will be an additional dll sitting on top of them: a .Net Framework based project that is simply the entry point. At build time the output for that dll will include everything it needs to run, giving us the same dependencies as before.

    Since I am planning to create a transport interface that is very similar to Photon's, it would be trivial to keep Photon as our transport, essentially transparently. In the future we could create our own transport that is .Net Standard compliant and runs on .Net Core 2.1 on a Linux server, for example. We get the benefit of more performance for our server applications as well as being able to choose the platform they run on.
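
    A guess at the shape such a transport interface might take; every member name here is my own assumption, not the planned API:

```csharp
// Hypothetical transport abstraction: the framework only ever hands the
// transport raw bytes and reacts to connection events.
using System;

public interface ITransport
{
    // Raise when a complete message arrives from the remote side.
    event Action<byte[]> Received;

    event Action Connected;
    event Action Disconnected;

    void Connect(string host, int port);
    void Disconnect();

    // All framework traffic is already serialized to byte[] at this point.
    void Send(byte[] payload);
}

// A Photon-backed implementation would adapt Photon's peer to this
// interface; a .Net Standard socket transport would implement it directly.
```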

    So again, nothing really changes that much. It is essentially semantics for now, until we can take advantage of it, but it allows us to stay open before we close ourselves off through a restrictive set of dependencies and platforms.

    Mine has been PUBG for the last year or so. Way too many hours in that. Well, that and work. Crazy hours these last few months with feature-complete deadlines, bug-fix deadlines, and shipping about to happen. But hey, at least I have a tiny feature in a piece of software that hundreds of millions of people use, so that is cool I guess.
    Last edited by am385; 07-31-2018 at 11:04 AM.

  4. #4
    Join Date
    Feb 2003
    Arlington, Va
    Sounds interesting. I will be eager to hear about your progress. I would love to work through the MMO tutorials.

  5. #5
    Join Date
    Aug 2010
    I have gotten rid of Photon in the framework layers but more on that in a minute.

    I think the biggest issue here is that we combined too much stuff into what an MMO framework is. The reason is that we were not actually making an MMO framework. What we spent most of the time on was making a robust, developer-friendly RPC (Remote Procedure Call) framework. Most of the code revolves around creating interfaces and implementations of systems that have a local side and a proxy side and can be called remotely through our systems: a call is converted down to an operation code, a mapped method, and a byte array of data through a few different serializers, and then sent over a transport. By doing a bunch of codegen at build time and reflection at startup, we are able to remap every call to its proxied method on the other side of the transport. From there we built a set of specialized operations that can instantiate the systems we want listening on the other side as well, through our factories.
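
    A minimal sketch of that wire shape: operation code, mapped-method id, serialized payload. Field names and widths are my assumptions, not the framework's actual format:

```csharp
// Hypothetical envelope for an RPC call reduced to opcode + method + bytes.
using System.IO;

public readonly struct RpcEnvelope
{
    public readonly byte OperationCode;  // which system/operation
    public readonly short MethodId;      // which mapped method on that system
    public readonly byte[] Payload;      // serializer output

    public RpcEnvelope(byte op, short method, byte[] payload)
    {
        OperationCode = op;
        MethodId = method;
        Payload = payload;
    }

    // Flatten to the byte[] that goes over the transport.
    public byte[] ToBytes()
    {
        using (var ms = new MemoryStream())
        using (var w = new BinaryWriter(ms))
        {
            w.Write(OperationCode);
            w.Write(MethodId);
            w.Write(Payload.Length);
            w.Write(Payload);
            return ms.ToArray();
        }
    }

    // Rebuild on the other side before remapping to the proxied method.
    public static RpcEnvelope FromBytes(byte[] data)
    {
        using (var ms = new MemoryStream(data))
        using (var r = new BinaryReader(ms))
        {
            var op = r.ReadByte();
            var method = r.ReadInt16();
            var length = r.ReadInt32();
            return new RpcEnvelope(op, method, r.ReadBytes(length));
        }
    }
}
```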

    From there we started to build game servers, clients, a launcher, a data repository, and a website directly on top of this RPC framework. There should be a layer in between the two that would be the MMO framework: a layer that defines what Servers, Zones, Regions, Items, Interest Areas, and Interest Management are. Hence, there was no MMO framework. I think the idea was to eventually abstract some of this later on and patch in a generic framework, but the series is now gone.

    That aside, when I said before that I have removed Photon from the framework layers, I meant the RPC layers so far. Next I will be removing it from the higher levels as well, but first I need to flesh out the lower levels and push the dependency upwards and out so that I can easily test and ensure the functionality is still the same.

  6. #6
    Join Date
    Aug 2010
    I picked this up again for the past few months and made some interesting changes.

    Replaced Photon with a custom async socket server. Photon can still be used, but now any transport layer can be used. I developed a simple TCP socket server / client that can be used instead, so running directly from VS is now possible. To do this I essentially copied some of the language Photon was using so that everything is still familiar. There are still events and operation requests, but all of the serialization and transformation is handled by my own systems. Now the transport layers only send and receive byte[] and handle the connection. Photon has a SendMessage function which will send an object; this can be used to send the byte[] that is required for a connection.
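
    Since TCP is a stream, a transport that only moves byte[] needs some framing. A length-prefixed scheme is the simplest; this is just a sketch of that idea, not my actual implementation:

```csharp
// Minimal length-prefixed framing for byte[] messages over a TCP stream.
using System;
using System.IO;
using System.Net.Sockets;

public static class Framing
{
    public static void SendFrame(NetworkStream stream, byte[] payload)
    {
        var prefix = BitConverter.GetBytes(payload.Length); // 4-byte length
        stream.Write(prefix, 0, prefix.Length);
        stream.Write(payload, 0, payload.Length);
    }

    public static byte[] ReadFrame(NetworkStream stream)
    {
        var prefix = ReadExactly(stream, 4);
        var length = BitConverter.ToInt32(prefix, 0);
        return ReadExactly(stream, length);
    }

    // TCP reads can return fewer bytes than asked for, so loop until full.
    private static byte[] ReadExactly(NetworkStream stream, int count)
    {
        var buffer = new byte[count];
        var read = 0;
        while (read < count)
        {
            var n = stream.Read(buffer, read, count - read);
            if (n == 0) throw new EndOfStreamException("Connection closed.");
            read += n;
        }
        return buffer;
    }
}
```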

    Replaced Log4Net with ILogger and Serilog. This was a personal choice, as I prefer Serilog. The entire logging system was replaced with the abstract logging provided by the Microsoft.Extensions.Logging package, and I am using a Serilog provider for that package. This means anyone wanting to keep using log4net could still do so by leveraging a log4net logging provider.
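
    The wiring for that looks roughly like this with a recent Microsoft.Extensions.Logging (plus the Serilog, Serilog.Extensions.Logging, and Serilog.Sinks.Console packages); consider it a sketch of the pattern rather than my exact setup:

```csharp
// The framework only ever sees ILogger / ILoggerFactory; the concrete
// sink (Serilog here, log4net if you prefer) is plugged in at the top.
using Microsoft.Extensions.Logging;
using Serilog;

class Program
{
    static void Main()
    {
        // Concrete Serilog pipeline, chosen at the entry point only.
        Log.Logger = new LoggerConfiguration()
            .WriteTo.Console()
            .CreateLogger();

        // Bridge Serilog into the abstract logging package.
        using (var factory = LoggerFactory.Create(builder => builder.AddSerilog()))
        {
            var log = factory.CreateLogger("Startup");
            log.LogInformation("Server starting on port {Port}", 4530);
        }
    }
}
```

Swapping providers means changing only the `AddSerilog()` line at the entry point; nothing in the framework layers references Serilog directly.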

    Code generation now creates the file in the obj directory before build instead of in the project directly. This means it is no longer checked into the project. I still want to do some work here on the naming of the files so that it is stable.

    Moved to .Net Standard for supporting libraries. The entire framework is now .Net Standard, with only the top levels being either .Net Framework or .Net Core depending on what is consuming them. For example, the Photon version of the server classes is .Net Framework and my custom socket server is .Net Core. For the Unity project I am using .Net Framework, as it pulled all of the supporting dlls correctly.

    Still going to work on this more and more. Right now my project looks very different from where Nelson left things off.
