Category: Blog

  • contracts

    CodeforDAO Contracts

    Build on, build upon, and code for DAOs.

    Make DAO the next generation of productivity tools for global collaboration.

    Follow us on Twitter @codefordao.

    MIT license

    This project is a work in progress. It has not been audited for security, is currently deployed only on local development and test networks, and is not yet gas optimized. Please use with caution.

    The CodeforDAO contracts are a set of DAO infrastructure and efficiency tools with a membership NFT at their core.

    They are centered on a membership contract implementing the ERC721 protocol, a counterpart share contract created alongside it, two parallel governance frameworks, and a timelock vault contract.

    They introduce some basic features to DAOs, including a vault contract for self-service participation in investments and a set of modular frameworks to support aggressive governance.

    Core Concepts

    We believe that DAOs will be a very powerful tool. They have already become an important concept trusted by developers worldwide and by a large number of users familiar with Web3: people who do not know each other, spread all over the world, contributing to the same DAO at the same time.

    But we have to admit that, while DAOs have established an important foundation of trust, their infrastructure is still immature, especially in terms of efficiency tools. There is no guarantee that a DAO can run as continuously and efficiently as a large traditional organization, such as a shareholding company. This is exactly the gap CodeforDAO aims to close.

    By improving aggressive governance, continuously contributing infrastructure and modules to DAOs, and even introducing AI-automated governance, we hope to take DAOs to a new level of efficiency.

    Structures

    This is a brief introduction to the structure of the contracts. If you are interested in more detail, please read the contract code in the ./contracts folder.

    Membership

    The membership contract is the entry point for the other subcontracts, and it is an ERC721 contract. It includes a simple allowlist invitation function and provides an investMint(to) method to ensure that external investors receive a corresponding membership (similar to a seat on a board of managers).

    Share

    The share contract is a simple, full-featured ERC20 contract whose ownership is delegated to the vault contract upon creation.

    Treasury

    After the initialization function is completed, the vault contract becomes the owner of all contracts. It stores all the assets and contract permissions of the DAO. It provides an invest method that allows external investors to participate in financing the DAO and issues corresponding shares to them. The vault contract also works with specific modules, which are authorized to use some of the assets for the daily management of the DAO.

    Governor

    The governance contracts allow voting with both ERC721 and ERC20 tokens. After the initialization function is completed, there are two governance contracts: one supports voting with the membership NFT (1:1), a role similar to founding-team voting, and the other supports voting with shares (similar to class B shares on the board).

    Module

    The core module contract provides a set of methods that allow modules and vaults to interact. At the same time, it is an actively governed multi-signature contract that allows proposing, confirming, scheduling and executing module-related operations, and you can see the usage of these hook functions in specific modules.

    Get started

    To start developing, you should be familiar with smart contract basics and install the corresponding development environment.

    $ npm install

    Note: these smart contracts are not designed as library contracts; fork them locally and modify them yourself rather than importing them directly via a git link.

    If you encounter a dependency conflict during installation, it is because the version of the hardhat-deploy-ethers module is incompatible with the @nomiclabs/hardhat-ethers version required by @nomiclabs/hardhat-waffle. Add the --force flag to $ npm install to resolve this problem.

    Membership NFT

    Currently, the membership NFT contract (contracts/contracts/core/Membership.sol) is the entry point for all contracts and the creator of all contracts.

    This means that deploying this contract will deploy a full set of DAO governance contracts, including the vault, an ERC20 token contract, and two sets of governance contracts.

    After deployment, you need to call the setupGovernor method to release important permissions and hand them over to the vault contract, which secures the governance of the DAO.

    Note: In the future, the way the membership contract is initialized may change, and in order to optimize gas fees, we may modify it to allow external scripts to modify permissions.

    Run the npm run deploy:test command to deploy the contracts, or refer to the ./tests folder for test cases.

    Work with Web UI

    To use the contract with the Web UI, we need to run the hardhat network locally and export the ABI of the contracts in the directory where the UI is located, which by default is: ../website/contracts/.

    To run the hardhat network locally with contracts deployed:

    $ npm run dev

    You can also pass a TEST_STAGE variable to npm run dev. This stage flag is for Web UI testing purposes: it makes sure the localhost blockchain starts with the expected contract states.

    $ npm run dev:mint_ready

    This is the same as:

    $ TEST_STAGE=MINT_READY npm run dev

    Then add the MulticallV1 contract address, without the 0x prefix, to the .env.local file of the Web UI project. (Make sure to use the address you just deployed, not the sample one.)

    NEXT_PUBLIC_LOCALHOST_MULTICALL_ADDRESS=Cf7Ed3AccA5a467e9e704C703E8D87F634fB0Fc9
    

    This is useful for third-party web modules like multicall.js or useDapp.

    Extending Modules

    Modules are an important part of aggressive governance. By writing your own modules, you can make any business process part of the DAO.

    Using the Payroll module as an example, we can take a look at how to write our own module.

    contract Payroll is Module {
      using Strings for uint256;
      using Address for address payable;
    
      constructor(
        address membership,
        uint256[] memory operators,
        uint256 delay
      ) Module('Payroll', 'Payroll Module V1', membership, operators, delay) {}
    }
    

    By inheriting from the core module, Payroll needs to initialize the core module's constructor, which automatically provides a timelock contract, available via payroll.timelock().

    The module must pass three parameters: the address of the member NFT contract, the list of operator IDs (NFT tokenIDs) and the delay before a timelocked proposal can be executed.

    You can easily define structured data and events in the module.

    struct PayrollDetail {
      uint256 amount;
      PayrollType paytype;
      PayrollPeriod period;
      PayrollInTokens tokens;
    }
    
    event PayrollAdded(uint256 indexed memberId, PayrollDetail payroll);
    event Payrollscheduled(uint256 indexed memberId, bytes32 proposalId);
    
    mapping(uint256 => mapping(PayrollPeriod => PayrollDetail[])) private _payrolls;

    The application module must implement the proposal function of the core module; simply put, it must expose an external method that allows operators to make proposals.

    /**
     * @dev Schedule Payroll
     * Adding a member's compensation proposal to the compensation cycle
     */
    function schedulePayroll(uint256 memberId, PayrollPeriod period)
      public
      onlyOperator
      returns (bytes32 _proposalId)
    {
      // Create proposal payload
      PayrollDetail[] memory payrolls = GetPayroll(memberId, period);
      address[] memory targets = new address[](payrolls.length);
      uint256[] memory values = new uint256[](payrolls.length);
      bytes[] memory calldatas = new bytes[](payrolls.length);
      string memory description = string(
        abi.encodePacked(
          _payrollPeriods[uint256(period)],
          ' Payroll for #',
          memberId.toString(),
          '(',
          _payrollTypes[uint256(payrolls[0].paytype)],
          ')',
          '@',
          block.timestamp.toString()
        )
      );
    
      // You can use the methods of the core module to get the corresponding address
      address memberWallet = getAddressByMemberId(memberId);
    
      for (uint256 i = 0; i < payrolls.length; i++) {
        PayrollDetail memory payroll = payrolls[i];
        targets[i] = address(this);
        values[i] = payroll.amount;
    
        // Fill in proposal payload calldatas
        calldatas[i] = abi.encodeWithSignature(
          'execTransfer(address,address[],uint256[])',
          memberWallet,
          payroll.tokens.tokens,
          payroll.tokens.amounts
        );
      }
    
      // Propose It.
      _proposalId = propose(targets, values, calldatas, description);
    
      // Trigger your event
      emit Payrollscheduled(memberId, _proposalId);
    }
    

    Correspondingly, the application module also needs to implement the specific proposal execution methods. In this case, the method is execTransfer.

    Check the Payroll module to see the detailed implementation.

    By default, proposals in a module need to be confirmed by all operators before they can enter the queue and wait for execution. The lifecycle must go through four stages: proposal, confirmation, queueing and execution. Since modules are aggressively governed, module proposals do not need to go through a full DAO vote.
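    This lifecycle, with its all-operators confirmation rule, can be pictured as a small state machine. The Python sketch below is purely illustrative (the class and field names are assumptions, not the contract's actual logic):

```python
class ModuleProposal:
    """Toy model of the propose -> confirm -> queue -> execute lifecycle.

    A proposal may only be queued once every operator has confirmed it,
    mirroring the multi-signature rule described above.
    """

    def __init__(self, operators):
        self.operators = set(operators)   # NFT tokenIDs of the operators
        self.confirmed = set()
        self.state = "proposed"

    def confirm(self, operator):
        assert operator in self.operators, "not an operator"
        self.confirmed.add(operator)
        if self.confirmed == self.operators:
            self.state = "queued"         # all operators confirmed

    def execute(self):
        assert self.state == "queued", "needs confirmation by all operators"
        self.state = "executed"


p = ModuleProposal(operators=[1, 2, 3])
p.confirm(1)
p.confirm(2)
state_partial = p.state  # still "proposed": operator 3 has not confirmed
p.confirm(3)
state_queued = p.state   # "queued": ready to execute
p.execute()
```

    In the real contracts, the queue stage would additionally wait out the timelock delay before execution may run.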

    The application module’s timelock contract may use the vault’s assets within certain limits; you can approve and withdraw these assets with approveModulePayment() and pullModulePayment() in the vault contract. The pullPayments() method in the core module is also useful.

    Notice: approveModulePayment() requires a vote of the DAO.

    Running tests

    This project currently uses hardhat-deploy for multiple environment deployments and to increase the speed of testing.

    $ npm run test

    Run individual spec tests, which you can find in the ./test folder:

    $ npm run test:membership

    or

    $ npm run test:governor

    If you need a test coverage report:

    $ npm run test:coverage

    About Gas optimization

    The contract code in this project has not yet been systematically gas optimized, so the contracts will be quite expensive to deploy on the Ethereum mainnet. At this point, we do not recommend using them on the mainnet.

    As a result, the base libraries that the contract code relies on will change frequently and may be replaced by more efficient ones, but we will try to strike a balance between audited, reliable contracts and efficiency.

    MIT license

    Copyright (c) 2022 CodeforDAO <contact@codefordao.org>

    Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the “Software”), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions:

    The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software.

    THE SOFTWARE IS PROVIDED “AS IS”, WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.

  • Mini-Blockchain

    Mini-Blockchain

    Project Description

    An application that implements the main functions of Blockchain.

    Main features of the application:

    • Coin Mining;

    • Making transactions between users;

    • Showing the complete blockchain network stored by the user in the node;

    • Checking whether the Blockchain network is valid (no one tried to compromise any transaction);

    • Adding other nodes to the user’s list for future transactions between nodes;

    • Resolving conflicts in data stored by different nodes in order to store the same version of the Blockchain network;

    • Viewing a user’s coins.
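    The validity check in the list above (no one has tried to compromise any transaction) boils down to re-hashing each block and comparing it with the next block's stored reference. The following is a hypothetical Python sketch, not the project's actual code; field names such as previous_hash are assumptions:

```python
import hashlib
import json


def block_hash(block: dict) -> str:
    """Hash a block's contents deterministically."""
    payload = json.dumps(block, sort_keys=True).encode()
    return hashlib.sha256(payload).hexdigest()


def is_chain_valid(chain: list) -> bool:
    """Each block must reference the hash of the previous block,
    so altering any transaction breaks every later link."""
    for prev, curr in zip(chain, chain[1:]):
        if curr["previous_hash"] != block_hash(prev):
            return False
    return True


genesis = {"index": 0, "transactions": [], "previous_hash": "0"}
block1 = {
    "index": 1,
    "transactions": [{"from": "a", "to": "b", "amount": 5}],
    "previous_hash": block_hash(genesis),
}
chain = [genesis, block1]
valid_before = is_chain_valid(chain)

# Tampering with an already-mined transaction invalidates the chain.
genesis["transactions"].append({"from": "a", "to": "c", "amount": 99})
valid_after = is_chain_valid(chain)
```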

    How to start a project

    To successfully launch a project, you need to take the following steps sequentially:

    1. Launch Node

    To start a node, use the console and enter a command in the following format:

    python noeud.py port filename

    where,

    • port: port number, for example 80
    • filename: the name of the file where the private and public keys will be stored

    Note: if desired, you can run multiple nodes using multiple consoles.

    2. Launch the application

    To run the application, use the console and enter a command in the following format:

    python application.py

    Notes

    Tip

    The server side of the application is written using the Flask library.

    Important

    All necessary libraries that require installation are stored in the requirements.txt file.

    Note

    This code ran and worked correctly on Windows 10 and Python version 3.12.1

    Warning

    This project was created to get acquainted with the principles of operation of a Blockchain network; using it for real projects is highly discouraged.


  • Unity-Design-Pattern

    Design Patterns Written in Unity3D

    This repository is about cool design patterns written in Unity3D C#.

    • Now 23 Gang of Four Patterns have all been finished in Unity3D in this repository.
    • Each pattern contains the corresponding structure implementation, application examples and diagrams. As with Naphier/unity-design-patterns, each pattern in this repository is contained in a separate folder. Inside each is a folder (“Structure”) showing which classes make up the pattern’s structure in Unity3D (with a scene), and one or more folders (“Example”) showing real-world examples of using the pattern in Unity3D, each with a scene showing it in action. Each pattern folder may contain one or more Examples.
    • Game design patterns from book Game Programming Patterns have been partially implemented.

    Contents

    I. Gang of Four Patterns in Unity

    Behavioral Patterns

    Structural Patterns

    Creational Patterns

    II. Game Programming Patterns in Unity

    Reference resources


  • RxPullToRefresh


    RxPullToRefresh

    A Swift library allows you to create a flexibly customizable pull-to-refresh view supporting RxSwift.


    Features

    • Support UIScrollView, UITableView, and UICollectionView
    • Customizable refresh view
    • Customizable animation options
    • Configurable option to load while dragging or only after the user releases their finger
    • Error handling
    • Support RxSwift/RxCocoa

    Requirements

    • iOS 10.0 or later
    • Swift 5.0 or later

    Installation

    Carthage

    Add the following to your Cartfile and follow these instructions.

    github "gumob/RxPullToRefresh"        # Swift 5.0
    github "gumob/RxPullToRefresh" ~> 1.0 # Swift 5.0
    github "gumob/RxPullToRefresh" ~> 0.1 # Swift 4.2
    

    Do not forget to include RxSwift.framework and RxCocoa.framework. Otherwise it will fail to build the application.


    CocoaPods

    To integrate RxPullToRefresh into your project, add the following to your Podfile.

    platform :ios, '10.0'
    use_frameworks!
    
    pod 'RxPullToRefresh', '~> 1.0'   # Swift 5.0
    pod 'RxPullToRefresh', '~> 0.1'   # Swift 4.2

    Usage

    Read the API reference and the USAGE.md for detailed information.

    Basic Usage

    Import frameworks to your project

    import RxSwift
    import RxCocoa
    import RxPullToRefresh

    Add RxPullToRefresh

    Create a RxPullToRefresh object.

    // Create a RxPullToRefresh object
    self.topPullToRefresh = RxPullToRefresh(position: .top)
    // Add a RxPullToRefresh object to UITableView
    self.tableView.p2r.addPullToRefresh(self.topPullToRefresh)

    Observe RxPullToRefreshDelegate

    By observing RxPullToRefreshDelegate, you can watch the state of a RxPullToRefresh object. This delegate gets called by the RxPullToRefresh object every time its state or scroll rate changes.

    // Observe RxPullToRefreshDelegate
    self.topPullToRefresh.rx.action
            .subscribe(onNext: { [weak self] (state: RxPullToRefreshState, progress: CGFloat, scroll: CGFloat) in
                // Send request if RxPullToRefreshState is changed to .loading
                switch state {
                case .loading: self?.prepend()
                default:       break
                }
            })
            .disposed(by: self.disposeBag)

    Load and append contents

    self.viewModel.prepend()
                  .subscribe(onSuccess: { [weak self] in
                      // Successfully loaded, collapse refresh view immediately
                      self?.tableView.p2r.endRefreshing(at: .top)
                  }, onError: { [weak self] (_: Error) in
                      // Failed to load, show error
                      self?.tableView.p2r.failRefreshing(at: .top)
                  })
                  .disposed(by: self.disposeBag)

    Disable refreshing by binding Boolean value to canLoadMore property

    self.viewModel.canPrepend
            .asDriver()
            .drive(self.topPullToRefresh.rx.canLoadMore)
            .disposed(by: self.disposeBag)

    Dispose RxPullToRefresh objects

    override func viewDidDisappear(_ animated: Bool) {
        super.viewDidDisappear(animated)
        self.tableView.p2r.endAllRefreshing()
        self.tableView.p2r.removeAllPullToRefresh()
    }

    Advanced Usage

    About the example project

    RxPullToRefresh allows you flexibly customize a refresh view by inheriting RxPullToRefresh and RxPullToRefreshView classes. Please check example sources for advanced usage.

    Build the example app

    1. Update Carthage frameworks:

       $ carthage update --platform iOS

    2. Open RxPullToRefresh.xcodeproj
    3. Select the scheme RxPullToRefreshExample from the drop-down menu in the upper left of the Xcode window
    4. Press ⌘R

    Copyright

    RxPullToRefresh is released under MIT license, which means you can modify it, redistribute it or use it however you like.

  • csvParser


    A parser for csv files in C, for C and C++

    This is an RFC 4180 parser for reading csv files from C and C++. It also handles csv files created by Excel, including multi-line cells. The latest change allows it to read very large csv files (tested up to 500,000 rows).

    The csv file is read into a linked list of rows, and each row has a linked list of cells for that row. After this list is created, an array of row pointers is built to allow fast lookup.

    This allows each cell to be accessed directly and in any order.

    //////////////////////////////////////
    // The rows are a linked list
    // Each row has a linked list of cells
    // R – C – C
    // |
    // R – C – C – C – C – C
    // |
    // R – C – C – C – C
    // |
    // etc
    //////////////////////////////////////

    There are just three functions: one to read the file, one to access the data in each cell, and one to free the memory when done. It compiles fine as C or C++, so it can be used from both.

    CsvType *csvPtr = readCsv(filename, csvSeperator);

    cell = getCell(csvPtr, row, column);

    void freeMem(csvPtr);

    See csvTest.c for a C example of reading and recreating the csv file to stdout.

    For the C++ version, there is a simple class defined.

    See csvTest.cpp for a C++ example of reading and recreating the csv file to stdout.

    I have written several csv parsers over the years. Those parsers used the troublesome C functions ‘strsep’ and ‘strtok’; either I was using them wrong or they are buggy under high load, and they also behaved differently on different machines and OSs. This csv parser does the tokenising itself, which proved easier than getting ‘strtok’ to work reliably. The code reads the csv into a tree-like structure: a linked list of rows, where each row has a linked list of cells. This makes it fast and easy to access each cell by row and column.
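    As an illustration of the tokenising involved, here is a rough sketch of RFC 4180 cell splitting, written in Python purely for brevity (the library itself does this in plain C):

```python
def parse_csv_line(line: str, sep: str = ",") -> list:
    """Split one csv line into cells, honoring RFC 4180 quoting:
    a cell may be wrapped in double quotes, and a doubled quote
    inside a quoted cell stands for a literal quote character."""
    cells, cell, i, in_quotes = [], [], 0, False
    while i < len(line):
        ch = line[i]
        if in_quotes:
            if ch == '"':
                if i + 1 < len(line) and line[i + 1] == '"':
                    cell.append('"')   # escaped quote inside a quoted cell
                    i += 1
                else:
                    in_quotes = False  # closing quote
            else:
                cell.append(ch)
        elif ch == '"':
            in_quotes = True
        elif ch == sep:
            cells.append("".join(cell))
            cell = []
        else:
            cell.append(ch)
        i += 1
    cells.append("".join(cell))
    return cells


row = parse_csv_line('a,"b,with,commas","say ""hi""",d')
```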

    While it is very fast, it should also read files of any size, limited only by how much memory you have. For extremely large files, it now uses an array of row pointers to quickly find the correct row. It has been tested on csv files with 500,000 lines.

    I used to roll my own linked lists before the C++ STL came along, and it was fun for me to write code using linked lists again.

    I have tried to make it RFC 4180 compliant. This repository consists of two files: csvParser.h and csvParser.c.

    A third file, csvTest.c, serves as an example and is used for testing.

    The csv file is read by this function:

    CsvType *csv = readCsv(filename, csvSeperator);

    Cells can be accessed via this function:

    cell = getCell(csv, row, column);

    For C++, there is a simple class defined, but it basically uses the C code: csvParser.h, csvParser.cpp (the same file as csvParser.c) and csvTest.cpp.

    The returned cell data structure has enough information to tell whether the cell exists, whether it is empty, and whether it is the last cell in the list (the end of the line). For details, see csvParser.h and csvParser.c.

    See csvTest.c as an example.

    Memory Leaks? I could not measure any. 🙂

    Possible improvements:

    1. Allow single quotes as well as double quotes
    2. Windows based files will lose the line feed characters; this might be an issue for multi-line cells.
  • usbpv_demo

    USB Packet Viewer demo

    Demo plugins for USB Packet Viewer.

    Packet Capture

    A DLL for capturing data must implement 1 Lua function and 3 DLL functions.

    Function in lua

    If there is more than one capture source, USB Packet Viewer will display a capture select dialog.

    local captureList = {
        "demoCap", -- demoCap.dll
    }
    function valid_capture()
        return table.concat(captureList, ",")
    end

    Function in dll

    typedef long  (__cdecl* pfn_packet_handler)(void* context, unsigned long ts, unsigned long nano, const void* data, unsigned long len, long status);
    
    USBPV_API long  __cdecl usbpv_get_option(char* option, long length);
    
    USBPV_API void* __cdecl usbpv_open(const char* option, void* context, pfn_packet_handler callback);
    
    USBPV_API long  __cdecl usbpv_close(void* handle);

    File Read/Write

    File read/write uses a Lua script, which must implement 3 functions: valid_filter, open_file and write_file.

    function valid_filter()
        return "example file (*.example);;All files (*.*)"
    end
    function open_file(name, packet_handler, context)
        --  open and parse the file
        --  packet_handler(context, ts, nano, packet, status or 0, current, total)
        return count -- total packet count in this file
    end
    function write_file(name, packet_handler, context)
        local count = 0
        while true do
            local ts, nano, packet, status = packet_handler(context)
            if ts and nano and packet then
                status = status or 0
                -- TODO: write packet to file
                count = count + 1
            else
                break
            end
        end
        return count
    end

    Packet Parser

    The packet parser uses a Lua script, which must implement 3 functions: parser_reset, parser_append_packet and parser_get_info.

    function parser_reset()
        -- reset parser state
        -- no return value
    end
    function parser_append_packet(ts, nano, pkt, status, id, transId, graph_handler, context)
        -- parse the packet
        -- append parse result to graph view, see scripts.lua for graph description format detail
        graph_handler(context, graph_description, transId, id, id1, id2, id3)
        -- no return value
    end
    function parser_get_info(id1, id2, id3)
        -- id1, id2, id3,  id of the graph element
        -- data, data to display in data view
        -- html, html info to display in decode view
        return data, html
    end


  • Conv-KR.NET

    Conv-KR.NET project

    If you want the Italian translation of this readme, see the readme translation.

    This is a conversion project from VB6 to C# of a very old project called Kripter (or KR), which encrypts each individual file with an Exclusive-Or (XOR) encryption algorithm. Encrypting every single file separately can be useful with backup utilities that diff files: if I decrypt the folder, modify one file and encrypt again, only that modified file is updated when I make the backup.
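    As background, XOR encryption is symmetric: applying the same operation with the same key a second time restores the original data. A minimal Python sketch (an illustration of the principle only, not Kripter's actual key handling):

```python
from itertools import cycle


def xor_crypt(data: bytes, key: bytes) -> bytes:
    """XOR each byte of data with the repeating key.

    Because XOR is its own inverse, the same function both
    encrypts and decrypts.
    """
    return bytes(b ^ k for b, k in zip(data, cycle(key)))


plaintext = b"hello kripter"
key = b"secret"
ciphertext = xor_crypt(plaintext, key)  # encrypt
restored = xor_crypt(ciphertext, key)   # decrypt with the same key
```

    This symmetry is also why such a scheme is weak: anyone with the key, or enough known plaintext, can reverse it.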

    The program, being old, contains several inaccuracies and non-optimal programming practices that reflect my limited experience at the time; for example, using the non-clean-code prefix str for all strings.

    The solution, called KR.NET, is a straightforward conversion. In the future I will probably create a new repository where I evolve the program to improve it from a clean-code point of view (for example by eliminating the str prefixes) and add new features, for instance to avoid relying on symmetric cryptography, which is easily broken.

    In the [Kripter](Kripter) folder there is the Kripter VB6 project, while the other folders make up the solution converted to C#, where each folder is a C# .NET Framework project.
    The VB6 project had the advantage of being modularized using VB6 modules, so the subsequent conversion was easier.

    Repository commits are in Italian, so I may create a new project with English translations in the future.

    Creation of the unit test [project](KR.NET/KRTest)

    Given the modularity of the VB6 source, it was easy to create, using [Microsoft Visual Studio Test](https://docs.microsoft.com/it-it/visualstudio/test/using-microsoft-visualstudio-testtools-unittesting-members-in-unit-tests?view=vs-2022), a set of tests that verify the correctness of each converted VB6 module.
    Among other things, this helped with fixing bugs, so that creating the interfaces on top of the module functions was limited to creating the controls and converting only the code for the events and the functions/subroutines internal to the forms.

    Since this was a manual conversion, I was pleasantly surprised to have written the tests before adding the forms rather than after. If a reliable tool were available (in the future I could try [VB Migration Partner](https://www.vbmigration.com/)), I would only add tests after checking the interfaces and discovering bugs.

    Running the tests first of all requires creating the test environment: a folder containing the projects themselves, in which they have been encrypted with the VB6 executable into a subfolder. This setup is the First method found in [UnitTestCore.cs](KR.NET/KRTest/UnitTestCore.cs).

    Detailed considerations on conversion

    The conversion was done faithfully; however, there were some parts I had to, or wanted to, rework, detailed below.

    The original project in some cases used winapi, but in the relevant commits I replaced those calls with their .NET Framework equivalents.

    In general, I fixed some bugs that I found without changing correct behavior, as in the relevant commit.

    Towards the end of the project, I had to move all the converted VB6 modules into a library project, since more than one VBP in the VB6 project uses them; this was the only way to avoid duplicating them. The relevant commits start with **Project Conversion KR Livio: Added Class Library with Modules**.

    I added some project-specific parts that were not included in the conversion process; these changes were necessary to make the KR.NET project usable without problems, in particular the block checkbox that prevents encryption unless it is deliberately deselected, see the [commit](https://github.com/Livio74/Conv-KR.NET/commit/39d820a07a5e0dcb66f7c948b561a8f9c122bada).
    I also added a check that prevents decrypting important folders like C:\Windows; the relevant commits are listed below:

    • [commit](../../commit/a2e7d78e0ac3ca9c018a64eecb1fb1105c9fd631)
    • [commit](../../commit/808129a9b2228072f4e99fb187163571b3449264)
    • [commit](../../commit/b3ee31f7c9a6e21b99700a56d0fc2a330d789e94)
    • [commit](../../commit/a6355845d4b8f5da4443e5307f999db7d671f808)

    Instruction for using the project and launching tests

The project was built with Visual Studio 2019; I expect it should work with later versions as well.
After installing Visual Studio, it is therefore sufficient to download the repository and open the solution containing all the converted projects.
Once downloaded, you can rebuild it and run the tests. Launching the tests creates by default a C:\KRTest folder, configured in the [configuration file](KR.NET/KRTest/test.runsettings) through the “WorkTestRoot” parameter, which can then be changed at will.
Before launching the whole suite, first run the First test method in “UnitTestCore” on its own: this sets up the test environment, and without it the rest of the tests would fail.

    Note

I have purposely chosen to include the generated executables and installation files. If they are not needed, I can always remove them later.

    Visit original content creator repository

  • energiapro_gas_consumption


    This repo or its code author is not affiliated with EnergiaPro.

    HACS configuration

    Make sure that you have the AppDaemon discovery and tracking enabled for HACS.

    Breaking change, new way to get data

Over the past few months, EnergiaPro has introduced changes to their customer portal, the latest being Cloudflare Turnstile, an invisible reCAPTCHA-like mechanism to prevent automated bots from doing… what I was doing :-/ Even for legitimate requests, this service detects bot activity and login no longer works.

    EnergiaPro now has an (unadvertised) API

    But all is not lost. While not advertised, there is an API available!

Until EnergiaPro officializes and socializes the API, you can reach out to them at clients@energiapro.ch to get more information about the API service and ultimately obtain the new set of credentials that make this AppDaemon app work.

    Energiapro pre-requisite

• Your gas installation is already equipped with EnergiaPro’s LoRaWAN equipment.
    • You possess API login credentials.

If you are not equipped with the LoRaWAN hardware, you should be able to contact EnergiaPro and request its installation and configuration at no charge.

    You will need to have the following information for configuration:

    • Your installation number, which you can find in the customer portal or on your invoice.
      • As you configure this number for this app, the format looks like 123456.000
    • Your client number, which you can find on your invoice
      • The format is more like 123456
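The two numbers have different shapes, which is easy to mix up. A quick sanity check could look like the following sketch (these helpers are hypothetical; the formats are only inferred from the examples above):

```python
import re

# Hypothetical validators -- formats inferred from the examples
# "123456.000" (installation number) and "123456" (client number).
def looks_like_installation_number(value: str) -> bool:
    """True if value matches the NNNNNN.NNN installation-number shape."""
    return re.fullmatch(r"\d+\.\d{3}", value) is not None

def looks_like_client_number(value: str) -> bool:
    """True if value is digits only, like a client number."""
    return re.fullmatch(r"\d+", value) is not None

print(looks_like_installation_number("123456.000"))  # True
print(looks_like_client_number("123456"))            # True
```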

    AppDaemon’s python packages pre-requisites

    Make sure you have the following python packages installed:

    • (deprecated, can be removed for use with the API) xlrd
    • (deprecated, can be removed for use with the API) pandas
    • (deprecated, can be removed for use with the API) beautifulsoup4
    • requests
    • bcrypt

    Configuration

    secrets.yaml

    You will need the following in your secrets.yaml file

    (deprecated, can be removed for use with the API) energiapro_email: <YOUR_EMAIL>
    (deprecated, can be removed for use with the API) energiapro_password: <YOUR_PASSWORD>
    energiapro_installation_number: "<YOUR_INSTALLATION_NUMBER>"
    energiapro_client_number: "<YOUR_CLIENT_NUMBER>"
    energiapro_bearer_token: <HA_LONG_LIVE_TOKEN>
    energiapro_api_base_url: "https://web2.holdigaz.ch/espace-client-api/api/"
    energiapro_api_username: "<API USER NUMBER>"
    energiapro_api_secret_seed: "<SECRET COMMUNICATED TO YOU BY ENERGIAPRO>"
    

Don’t forget to put your installation number between double quotes: otherwise YAML parses it as a number and the trailing zeros are lost.
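A minimal sketch of the truncation effect, using Python's own float parsing as a stand-in for YAML's number handling:

```python
# An unquoted YAML scalar like 123456.000 is resolved as a float, not a string.
# Python's float parsing shows the same loss of trailing zeros:
as_number = float("123456.000")
print(as_number)   # 123456.0 -- the ".000" suffix is gone

# Quoted in YAML ("123456.000"), the value stays a string and keeps its format:
as_string = "123456.000"
print(as_string)   # 123456.000
```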

    apps.yaml

    Define your app like the following. You can remove the deprecated secrets per the above too.

    energiapro_gas_consumption:
      module: energiapro_gas
      class: EnergiaproGasConsumption
      energiapro_base_url: https://www.holdigaz.ch/espace-client
      # energiapro_email: !secret energiapro_email
      # energiapro_password: !secret energiapro_password
      energiapro_bearer_token: !secret energiapro_bearer_token
      energiapro_installation_number: !secret energiapro_installation_number
      energiapro_client_number: !secret energiapro_client_number
      energiapro_api_username: !secret energiapro_api_username
      energiapro_api_base_url: !secret energiapro_api_base_url
      energiapro_api_secret_seed: !secret energiapro_api_secret_seed
      # ha_url: http://localhost:8123  # optional, in case hassplugin ha_url undefined
    

The energiapro_bearer_token refers to a long-lived Home Assistant access token, used to post the result back to Home Assistant.
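The token is used the way any Home Assistant REST API call uses it: sent as an `Authorization: Bearer` header when posting a sensor state to the standard `POST /api/states/<entity_id>` endpoint. A sketch of how such a request could be assembled (the entity name and the m³ unit are assumptions for illustration; nothing is actually sent here):

```python
import json

def build_state_update(ha_url: str, token: str, entity_id: str, state: str):
    """Assemble URL, headers and JSON body for a Home Assistant
    POST /api/states/<entity_id> call. Nothing is sent here."""
    url = f"{ha_url}/api/states/{entity_id}"
    headers = {
        "Authorization": f"Bearer {token}",   # the long-lived token goes here
        "Content-Type": "application/json",
    }
    body = json.dumps(
        {"state": state, "attributes": {"unit_of_measurement": "m³"}}
    ).encode()
    return url, headers, body

# "sensor.energiapro_gas" is a hypothetical entity id.
url, headers, body = build_state_update(
    "http://localhost:8123", "<HA_LONG_LIVE_TOKEN>", "sensor.energiapro_gas", "12.34"
)
print(url)  # http://localhost:8123/api/states/sensor.energiapro_gas
```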

    Manually trigger the app

The app can register an endpoint at energiapro_gas_consumption, which was mainly used during development. It has been commented out for “production”.

If you want to trigger a run manually, uncomment the relevant line in the initialize method; you can then call that endpoint, for example:

    $ curl -XPOST -i -H "Content-Type: application/json"  http://<YOUR_APPDAEMON_IP>:<YOUR_APPDAEMON_PORT>/api/appdaemon/energiapro_gas_consumption -d '{"action": "Call of Duty"}'
    

Troubleshooting

    No error, but no data either

    • Make sure you’ve configured your installation number within double quotes and that it is the right number.

    TODO:

    • how to backdate for previous day? (e.g. come up with good SQL probably)
    • Load historical data


  • sight-scala

sight-scala

    Scala client library for Sight APIs. The Sight API is a text recognition service.

    Scala 3.0.0

    Dependency

    libraryDependencies += "io.github.ashwinbhaskar" %% "sight-client" % "0.1.2"
    

    Scala 2.13.4 / 2.13.5

    Dependency

scalacOptions += "-Ytasty-reader"
libraryDependencies += "io.github.ashwinbhaskar" % "sight-client_3.0.0-RC3" % "0.1.2"
    

    API Key

Grab an API key from the Sight Dashboard

    Code

1. One Shot: If your files contain many pages, this will take some time, as the call returns only after all the pages have been processed. Use the function recognize as shown below.

      import sight.client.SightClient
      import sight.Types.APIKey
      import sight.models.Pages
      import sight.adt.Error
      
      val apiKey: Either[Error, APIKey] = APIKey("xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx")
  val files: Seq[String] = Seq("/user/john.doe/foo.pdf", "/user/john.doe/baz.bmp")
      val result: Either[Error, Pages] = apiKey.flatMap(key => SightClient(key).recognize(files))
      
      /*
  Helper extension methods to inspect the result.
      Note: Extension methods will not work with Scala 2.13.4 and 2.13.5
      */
      import sight.extensions._
      val allTxt: Either[Error, Seq[String]] = result.map(_.allText)
      val allTxtGt: Either[Error, Seq[String]] = result.map(_.allTextWithConfidenceGreaterThan(0.2))
      
2. Stream: You can choose to get pages as and when they are processed. This returns a LazyList that can be consumed as batches of pages become available. Use the function recognizeStream as shown below.

      import sight.Types.APIKey
      import sight.models.Page
      import sight.adt.Error
      import sight.client.SightClient
      
      val apiKey: Either[Error, APIKey] = APIKey("xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx")
  val files: Seq[String] = Seq("/user/john.doe/foo.pdf", "/user/john.doe/baz.bmp")
      apiKey match
          case Right(k) => 
              val result: LazyList[Either[Error, Seq[Page]]] = SightClient(k).recognizeStream(files)
              result.foreach(println)
      case Left(error) => println(error)
      

    Official API Documentation

    Here is the official API Documentation

  • slack-clone

    Getting Started with Create React App

    This project was bootstrapped with Create React App.

    Available Scripts

    In the project directory, you can run:

    npm start

    Runs the app in the development mode.
    Open http://localhost:3000 to view it in your browser.

    The page will reload when you make changes.
    You may also see any lint errors in the console.

    npm test

    Launches the test runner in the interactive watch mode.
    See the section about running tests for more information.

    npm run build

    Builds the app for production to the build folder.
    It correctly bundles React in production mode and optimizes the build for the best performance.

    The build is minified and the filenames include the hashes.
    Your app is ready to be deployed!

    See the section about deployment for more information.

    npm run eject

    Note: this is a one-way operation. Once you eject, you can’t go back!

    If you aren’t satisfied with the build tool and configuration choices, you can eject at any time. This command will remove the single build dependency from your project.

    Instead, it will copy all the configuration files and the transitive dependencies (webpack, Babel, ESLint, etc) right into your project so you have full control over them. All of the commands except eject will still work, but they will point to the copied scripts so you can tweak them. At this point you’re on your own.

    You don’t have to ever use eject. The curated feature set is suitable for small and middle deployments, and you shouldn’t feel obligated to use this feature. However we understand that this tool wouldn’t be useful if you couldn’t customize it when you are ready for it.

    Learn More

    You can learn more in the Create React App documentation.

    To learn React, check out the React documentation.

    Code Splitting

    This section has moved here: https://facebook.github.io/create-react-app/docs/code-splitting

    Analyzing the Bundle Size

    This section has moved here: https://facebook.github.io/create-react-app/docs/analyzing-the-bundle-size

    Making a Progressive Web App

    This section has moved here: https://facebook.github.io/create-react-app/docs/making-a-progressive-web-app

    Advanced Configuration

    This section has moved here: https://facebook.github.io/create-react-app/docs/advanced-configuration

    Deployment

    This section has moved here: https://facebook.github.io/create-react-app/docs/deployment

    npm run build fails to minify

    This section has moved here: https://facebook.github.io/create-react-app/docs/troubleshooting#npm-run-build-fails-to-minify
