August 27, 2019
Overview
The LoreSoft Blazor Controls project contains a collection of Blazor user controls.
Installing
The LoreSoft.Blazor.Controls library is available on nuget.org via package name LoreSoft.Blazor.Controls.
To install LoreSoft.Blazor.Controls, run the following command in the Package Manager Console
Install-Package LoreSoft.Blazor.Controls
Setup
To use, you need to include the following CSS and JS files in your index.html (Blazor WebAssembly) or _Host.cshtml (Blazor Server).
In the head tag add the following CSS.
<link rel="stylesheet" href="_content/LoreSoft.Blazor.Controls/BlazorControls.css" />
Then add the JS script at the bottom of the page using the following script tag.
<script src="_content/LoreSoft.Blazor.Controls/BlazorControls.js"></script>
Typeahead Features
- Searching data by supplying a search function
- Template for search result, selected value, and footer
- Debounce support for smoother search
- Character limit before searching
- Multiselect support
- Built-in form validation support
Typeahead Properties
Required
- Value - Bind to Value in single selection mode. Mutually exclusive with the Values property.
- Values - Bind to Values in multiple selection mode. Mutually exclusive with the Value property.
- SearchMethod - The method used to return search results
Optional
- AllowClear - Allow the selected value to be cleared
- ConvertMethod - The method used to convert search result type to the value type
- Debounce - Time to wait, in milliseconds, after last key press before starting a search
- Items - The initial list of items to show when there isn’t any search text
- MinimumLength - Minimum number of characters before starting a search
- Placeholder - The placeholder text to show when nothing is selected
Templates
- ResultTemplate - User defined template for displaying a result in the results list
- SelectedTemplate - User defined template for displaying the selected item(s)
- NoRecordsTemplate - Template for when no items are found
- FooterTemplate - Template displayed at the end of the results list
- LoadingTemplate - Template displayed while searching
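The Debounce option above delays the search until typing pauses. The idea can be sketched in plain C#: each keystroke cancels the pending delay and restarts it, so the search only fires after the quiet period. This is an illustrative sketch (the Debouncer class is hypothetical), not the control's actual implementation.

```csharp
using System;
using System.Threading;
using System.Threading.Tasks;

public class Debouncer
{
    private CancellationTokenSource _cts = new CancellationTokenSource();

    // Returns true if the quiet period elapsed (caller should run the search),
    // false if a newer keystroke cancelled this pending search.
    public async Task<bool> DebounceAsync(int delayMilliseconds)
    {
        // cancel the search pending from the previous keystroke
        _cts.Cancel();
        _cts = new CancellationTokenSource();

        try
        {
            await Task.Delay(delayMilliseconds, _cts.Token);
            return true;
        }
        catch (TaskCanceledException)
        {
            return false;
        }
    }
}
```

With a debounce of 500, the control would wait 500 ms after the last key press before invoking SearchMethod.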
Typeahead Examples
Basic Example
State selection dropdown. Bind to the Value property for single selection mode.
<Typeahead SearchMethod="@SearchState"
           Items="Data.StateList"
           @bind-Value="@SelectedState"
           Placeholder="State">
    <SelectedTemplate Context="state">
        @state.Name
    </SelectedTemplate>
    <ResultTemplate Context="state">
        @state.Name
    </ResultTemplate>
</Typeahead>

@code {
    public StateLocation SelectedState { get; set; }

    public Task<IEnumerable<StateLocation>> SearchState(string searchText)
    {
        var result = Data.StateList
            .Where(x => x.Name.ToLower().Contains(searchText.ToLower()))
            .ToList();

        return Task.FromResult<IEnumerable<StateLocation>>(result);
    }
}
Multiselect Example
When you want to be able to select multiple results, bind to the Values property. The target property must be of type IList<T>.
<Typeahead SearchMethod="@SearchPeople"
           Items="Data.PersonList"
           @bind-Values="@SelectedPeople"
           Placeholder="Owners">
    <SelectedTemplate Context="person">
        @person.FullName
    </SelectedTemplate>
    <ResultTemplate Context="person">
        @person.FullName
    </ResultTemplate>
</Typeahead>

@code {
    public IList<Person> SelectedPeople;

    public Task<IEnumerable<Person>> SearchPeople(string searchText)
    {
        var result = Data.PersonList
            .Where(x => x.FullName.ToLower().Contains(searchText.ToLower()))
            .ToList();

        return Task.FromResult<IEnumerable<Person>>(result);
    }
}
GitHub Repository Search
Use Octokit to search for a GitHub repository.
<Typeahead SearchMethod="@SearchGithub"
           @bind-Value="@SelectedRepository"
           Placeholder="Repository"
           MinimumLength="3">
    <SelectedTemplate Context="repo">
        @repo.FullName
    </SelectedTemplate>
    <ResultTemplate Context="repo">
        <div class="github-repository clearfix">
            <div class="github-avatar"><img src="@repo.Owner.AvatarUrl"></div>
            <div class="github-meta">
                <div class="github-title">@repo.FullName</div>
                <div class="github-description">@repo.Description</div>
                <div class="github-statistics">
                    <div class="github-forks"><i class="fa fa-flash"></i> @repo.ForksCount Forks</div>
                    <div class="github-stargazers"><i class="fa fa-star"></i> @repo.StargazersCount Stars</div>
                    <div class="github-watchers"><i class="fa fa-eye"></i> @repo.SubscribersCount Watchers</div>
                </div>
            </div>
        </div>
    </ResultTemplate>
</Typeahead>

@inject IGitHubClient GitHubClient

@code {
    public Repository SelectedRepository { get; set; }

    public async Task<IEnumerable<Repository>> SearchGithub(string searchText)
    {
        var request = new SearchRepositoriesRequest(searchText);
        var result = await GitHubClient.Search.SearchRepo(request);

        return result.Items;
    }
}
December 10, 2018
Overview
In this tutorial, you’ll learn how to use Entity Framework Core Generator to create an Entity Framework Core data model for an ASP.NET Core Web API. Entity Framework Core Generator (efg) is a .NET Core command-line (CLI) tool to generate an Entity Framework Core model from an existing database.
The code for this tutorial is available at https://github.com/pwelter34/EntityFrameworkCore.Generator.Demo
Tracker Database
This tutorial will be a database first model. The database is a simple task tracking database.

The DDL script can be downloaded here. To create the database on your local SQL Server, run the following command.
sqlcmd -S localhost -i Tracker.sql -E
ASP.NET Core Web API Project
To get started, create a new ASP.NET Core Web API project. You can either use Visual Studio or the command line. In this demo, you’ll use the command line.
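For example, from the command line (the project name Tracker is illustrative):

```shell
dotnet new webapi --name Tracker
cd Tracker
```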
You’ll need a few NuGet packages for the mapper and validation classes.
dotnet add package AutoMapper
dotnet add package AutoMapper.Extensions.Microsoft.DependencyInjection
dotnet add package FluentValidation.AspNetCore
dotnet add package Swashbuckle.AspNetCore

Entity Framework Core Generator
To use Entity Framework Core Generator, you need to install the .NET Core Global Tool.
dotnet tool install --global EntityFrameworkCore.Generator
After the tool has been installed, the efg command line will be available. Run efg --help for command line options.
Initialize Configuration
Entity Framework Core Generator has many options available to customize the generated output. The initialize
command is used to create the configuration yaml file and optionally set the database connection string. Update the configuration file to configure the generated output.
The following command will create an initial generation.yaml
configuration file as well as setting a user secret to store the connection string.
efg initialize -c "Data Source=(local);Initial Catalog=Tracker;Integrated Security=True" ^
--id "984ef0cf-2b22-4fd1-876d-e01499da4c1f" ^
--name "ConnectionStrings:Tracker"
The following is the yaml file created by default
project:
  namespace: '{Database.Name}'
  directory: .\
database:
  connectionName: ConnectionStrings:Tracker
  userSecretsId: 984ef0cf-2b22-4fd1-876d-e01499da4c1f
data:
  context:
    name: '{Database.Name}Context'
    baseClass: DbContext
    namespace: '{Project.Namespace}.Data'
    directory: '{Project.Directory}\Data'
  entity:
    namespace: '{Project.Namespace}.Data.Entities'
    directory: '{Project.Directory}\Data\Entities'
  mapping:
    namespace: '{Project.Namespace}.Data.Mapping'
    directory: '{Project.Directory}\Data\Mapping'
  query:
    generate: true
    indexPrefix: By
    uniquePrefix: GetBy
    namespace: '{Project.Namespace}.Data.Queries'
    directory: '{Project.Directory}\Data\Queries'
model:
  shared:
    namespace: '{Project.Namespace}.Domain.Models'
    directory: '{Project.Directory}\Domain\Models'
  read:
    generate: true
    name: '{Entity.Name}ReadModel'
  create:
    generate: true
    name: '{Entity.Name}CreateModel'
  update:
    generate: true
    name: '{Entity.Name}UpdateModel'
  mapper:
    generate: true
    name: '{Entity.Name}Profile'
    baseClass: AutoMapper.Profile
    namespace: '{Project.Namespace}.Domain.Mapping'
    directory: '{Project.Directory}\Domain\Mapping'
  validator:
    generate: true
    name: '{Model.Name}Validator'
    baseClass: AbstractValidator<{Model.Name}>
    namespace: '{Project.Namespace}.Domain.Validation'
    directory: '{Project.Directory}\Domain\Validation'
Generate Entity Framework Core Model
In order to use Entity Framework Core, you need a DbContext, entity classes, and mappings from tables to those entities. Entity Framework Core Generator creates these for you from the database. To generate the files, run the generate command.
efg generate

Generation Output
The generate command will create the following files and directory structure by default. The root directory defaults to the current working directory. Most of the output names and locations can be customized in the configuration file.
Data Context Output
The EntityFramework DbContext file will be created in the Data directory.
Entities Output
The Entities directory will contain the generated source file for entity class representing each table.
Mapping Output
The Mapping directory contains a fluent mapping class to map each entity to its table.
Model Output
Entity Framework Core Generator has an option to create view models for each entity. The output will go in the Domain directory by default.
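Given the default configuration above, the output lands in a layout like the following. The directories come from the configuration file; the individual file names shown for the Task entity are illustrative, not the template's exact output.

```
.
├── Data
│   ├── TrackerContext.cs
│   ├── Entities
│   │   └── Task.cs
│   ├── Mapping
│   │   └── TaskMap.cs
│   └── Queries
│       └── TaskExtensions.cs
└── Domain
    ├── Models
    │   ├── TaskReadModel.cs
    │   ├── TaskCreateModel.cs
    │   └── TaskUpdateModel.cs
    ├── Mapping
    │   └── TaskProfile.cs
    └── Validation
        └── TaskCreateModelValidator.cs
```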
Generated Model Cleanup
Entity Framework Core Generator supports safe regeneration via region replacement and source code parsing. A typical workflow for a project requires many database changes and updates. Being able to regenerate the entities and associated files is a huge time saver.
Rename Property
The code generator makes its best attempt to convert names to their plural form using the Humanizer library. In some cases it fails. The first cleanup to do is to rename the TrackerContext.TaskExtendeds property to TrackerContext.TaskExtended.

When the generate command is re-run, this rename will be preserved.
Identifier Interface
In order to handle entities in a generic way, we’ll need to add some interfaces to them. We’ll add the IHaveIdentifier interface to all entities and models.
Interface definition
using System;

namespace Tracker.Definitions
{
    public interface IHaveIdentifier
    {
        Guid Id { get; set; }
    }
}
Add the interface to all entities that have an Id primary key. Below is an example entity class with the interface added.
public partial class Priority : IHaveIdentifier
{
    public Priority()
    {
        #region Generated Constructor
        Tasks = new HashSet<Task>();
        #endregion
    }

    #region Generated Properties
    public Guid Id { get; set; }
    public string Name { get; set; }
    public string Description { get; set; }
    public int DisplayOrder { get; set; }
    public bool IsActive { get; set; }
    public DateTimeOffset Created { get; set; }
    public string CreatedBy { get; set; }
    public DateTimeOffset Updated { get; set; }
    public string UpdatedBy { get; set; }
    public Byte[] RowVersion { get; set; }
    #endregion

    #region Generated Relationships
    public virtual ICollection<Task> Tasks { get; set; }
    #endregion
}
Notice the file has some regions like #region Generated .... These regions are what get replaced the next time efg generate is called. Since the interface was added outside of those regions, it will not get overwritten.
Read more about regeneration in the documentation.
Web API
You’ll need to add a few more things to your Web API project to get things going.
Application Startup
You’ll need to change the application startup to register the Entity Framework context as well as register the AutoMapper profiles.
using AutoMapper;
using FluentValidation.AspNetCore;
using Microsoft.AspNetCore.Builder;
using Microsoft.AspNetCore.Hosting;
using Microsoft.AspNetCore.Mvc;
using Microsoft.EntityFrameworkCore;
using Microsoft.Extensions.Configuration;
using Microsoft.Extensions.DependencyInjection;
using Swashbuckle.AspNetCore.Swagger;
using Tracker.Data;

namespace Tracker
{
    public class Startup
    {
        public Startup(IConfiguration configuration)
        {
            Configuration = configuration;
        }

        public IConfiguration Configuration { get; }

        // This method gets called by the runtime. Use this method to add services to the container.
        public void ConfigureServices(IServiceCollection services)
        {
            // sharing the user secret configuration file
            var connectionString = Configuration.GetConnectionString("Tracker");
            services.AddDbContext<TrackerContext>(options => options.UseSqlServer(connectionString));

            // register AutoMapper profiles
            services.AddAutoMapper();

            services
                .AddMvc()
                .AddFluentValidation(fv => fv.RegisterValidatorsFromAssemblyContaining<Startup>()) // register validators
                .SetCompatibilityVersion(CompatibilityVersion.Version_2_2);

            // Register the Swagger generator
            services.AddSwaggerGen(c =>
            {
                c.SwaggerDoc("v1", new Info { Title = "Tracker API", Version = "v1" });
            });
        }

        // This method gets called by the runtime. Use this method to configure the HTTP request pipeline.
        public void Configure(IApplicationBuilder app, IHostingEnvironment env)
        {
            if (env.IsDevelopment())
            {
                app.UseDeveloperExceptionPage();
            }
            else
            {
                app.UseHsts();
            }

            // Enable middleware to serve generated Swagger as a JSON endpoint.
            app.UseSwagger();

            // Enable middleware to serve swagger-ui, specifying the Swagger JSON endpoint.
            app.UseSwaggerUI(c =>
            {
                c.SwaggerEndpoint("/swagger/v1/swagger.json", "Tracker API");
            });

            app.UseHttpsRedirection();
            app.UseMvc();
        }
    }
}
Base Controller
To make the basic read, create and update endpoints easier, create a base controller class like the following.
using System;
using System.Collections.Generic;
using System.Linq;
using System.Linq.Expressions;
using System.Threading;
using System.Threading.Tasks;
using AutoMapper;
using AutoMapper.QueryableExtensions;
using Microsoft.AspNetCore.Mvc;
using Microsoft.EntityFrameworkCore;
using Tracker.Data;
using Tracker.Definitions;

namespace Tracker.Controllers
{
    [ApiController]
    [Produces("application/json")]
    public abstract class EntityControllerBase<TEntity, TReadModel, TCreateModel, TUpdateModel> : ControllerBase
        where TEntity : class, IHaveIdentifier
    {
        protected EntityControllerBase(TrackerContext dataContext, IMapper mapper)
        {
            DataContext = dataContext;
            Mapper = mapper;
        }

        protected TrackerContext DataContext { get; }

        protected IMapper Mapper { get; }

        protected virtual async Task<TReadModel> ReadModel(Guid id, CancellationToken cancellationToken = default(CancellationToken))
        {
            var model = await DataContext
                .Set<TEntity>()
                .AsNoTracking()
                .Where(p => p.Id == id)
                .ProjectTo<TReadModel>(Mapper.ConfigurationProvider)
                .FirstOrDefaultAsync(cancellationToken);

            return model;
        }

        protected virtual async Task<TReadModel> CreateModel(TCreateModel createModel, CancellationToken cancellationToken = default(CancellationToken))
        {
            // create new entity from model
            var entity = Mapper.Map<TEntity>(createModel);

            // add to data set, id should be generated
            await DataContext
                .Set<TEntity>()
                .AddAsync(entity, cancellationToken);

            // save to database
            await DataContext
                .SaveChangesAsync(cancellationToken);

            // convert to read model
            var readModel = await ReadModel(entity.Id, cancellationToken);

            return readModel;
        }

        protected virtual async Task<TReadModel> UpdateModel(Guid id, TUpdateModel updateModel, CancellationToken cancellationToken = default(CancellationToken))
        {
            var keyValue = new object[] { id };

            // find entity to update by id, not model id
            var entity = await DataContext
                .Set<TEntity>()
                .FindAsync(keyValue, cancellationToken);

            if (entity == null)
                return default(TReadModel);

            // copy updates from model to entity
            Mapper.Map(updateModel, entity);

            // save updates
            await DataContext
                .SaveChangesAsync(cancellationToken);

            // return read model
            var readModel = await ReadModel(id, cancellationToken);

            return readModel;
        }

        protected virtual async Task<TReadModel> DeleteModel(Guid id, CancellationToken cancellationToken = default(CancellationToken))
        {
            var dbSet = DataContext
                .Set<TEntity>();

            var keyValue = new object[] { id };

            // find entity to delete by id
            var entity = await dbSet
                .FindAsync(keyValue, cancellationToken);

            if (entity == null)
                return default(TReadModel);

            // return read model
            var readModel = await ReadModel(id, cancellationToken);

            // delete entry
            dbSet.Remove(entity);

            // save
            await DataContext
                .SaveChangesAsync(cancellationToken);

            return readModel;
        }

        protected virtual async Task<IReadOnlyList<TReadModel>> QueryModel(Expression<Func<TEntity, bool>> predicate = null, CancellationToken cancellationToken = default(CancellationToken))
        {
            var dbSet = DataContext
                .Set<TEntity>();

            var query = dbSet.AsNoTracking();

            if (predicate != null)
                query = query.Where(predicate);

            var results = await query
                .ProjectTo<TReadModel>(Mapper.ConfigurationProvider)
                .ToListAsync(cancellationToken);

            return results;
        }
    }
}
Task Controller
Create a TaskController to Create, Update, Read and Delete tasks.
using System;
using System.Collections.Generic;
using System.Threading;
using System.Threading.Tasks;
using AutoMapper;
using Microsoft.AspNetCore.Mvc;
using Tracker.Data;
using Tracker.Domain.Models;

namespace Tracker.Controllers
{
    [Route("api/Task")]
    public class TaskController : EntityControllerBase<Data.Entities.Task, TaskReadModel, TaskCreateModel, TaskUpdateModel>
    {
        public TaskController(TrackerContext dataContext, IMapper mapper) : base(dataContext, mapper)
        {
        }

        [HttpGet("{id}")]
        public async Task<ActionResult<TaskReadModel>> Get(CancellationToken cancellationToken, Guid id)
        {
            var readModel = await ReadModel(id, cancellationToken);
            if (readModel == null)
                return NotFound();

            return readModel;
        }

        [HttpGet("")]
        public async Task<ActionResult<IReadOnlyList<TaskReadModel>>> List(CancellationToken cancellationToken)
        {
            var listResult = await QueryModel(null, cancellationToken);
            return Ok(listResult);
        }

        [HttpPost("")]
        public async Task<ActionResult<TaskReadModel>> Create(CancellationToken cancellationToken, TaskCreateModel createModel)
        {
            var readModel = await CreateModel(createModel, cancellationToken);
            return readModel;
        }

        [HttpPut("{id}")]
        public async Task<ActionResult<TaskReadModel>> Update(CancellationToken cancellationToken, Guid id, TaskUpdateModel updateModel)
        {
            var readModel = await UpdateModel(id, updateModel, cancellationToken);
            if (readModel == null)
                return NotFound();

            return readModel;
        }

        [HttpDelete("{id}")]
        public async Task<ActionResult<TaskReadModel>> Delete(CancellationToken cancellationToken, Guid id)
        {
            var readModel = await DeleteModel(id, cancellationToken);
            if (readModel == null)
                return NotFound();

            return readModel;
        }
    }
}
Test Endpoint
You can use the Swagger UI to test the endpoints.


Database Change / Regenerate
Now that you have the basic Web API project set up, you can run efg generate after any database change to keep all your entity and view models in sync with the database.
November 14, 2018
Overview
A .NET Core command-line (CLI) tool to generate an Entity Framework Core model from an existing database.
Features
- Entity Framework Core database first model generation
- Safe regeneration via region replacement
- Safe Renaming via mapping file parsing
- Optionally generate read, create and update models from entity
- Optionally generate validation and object mapper classes
Documentation
Entity Framework Core Generator documentation is available via Read the Docs
Installation
To install EntityFrameworkCore.Generator tool, run the following command in the console
dotnet tool install --global EntityFrameworkCore.Generator
After the tool has been installed, the efg command line will be available. Run efg --help for command line options.
Generate Command
Entity Framework Core Generator (efg) creates source code files from a database schema. To generate the files with no configuration, run the following
efg generate -c <ConnectionString>
Replace <ConnectionString> with a valid database connection string.
Generation Output
The generate command will create the following files and directory structure by default. The root directory defaults to the current working directory. Most of the output names and locations can be customized in the configuration file.
Data Context Output
The EntityFramework DbContext file will be created in the root directory.
Entities Output
The entities directory will contain the generated source file for entity class representing each table.
Mapping Output
The mapping directory contains a fluent mapping class to map each entity to its table.
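With no configuration file, the default output might look like the following (the TrackerContext and Task file names are illustrative, assuming a Tracker database):

```
.
├── TrackerContext.cs
├── Entities
│   └── Task.cs
└── Mapping
    └── TaskMap.cs
```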
Initialize Command
The initialize command is used to create the configuration yaml file and optionally set the connection string. The configuration file has many options to configure the generated output. See the configuration file documentation for more details.
The following command will create an initial generation.yaml
configuration file as well as setting a user secret to store the connection string.
efg initialize -c <ConnectionString>
When a generation.yaml configuration file exists, you can run efg generate in the same directory to generate the source using that configuration file.
Regeneration
Entity Framework Core Generator supports safe regeneration via region replacement and source code parsing. A typical workflow for a project requires many database changes and updates. Being able to regenerate the entities and associated files is a huge time saver.
Region Replacement
All the templates output a region on first generation. On future regeneration, only the regions are replaced. This keeps any other changes you’ve made to the source file.
Example of a generated entity class
public partial class Status
{
    public Status()
    {
        #region Generated Constructor
        Tasks = new HashSet<Task>();
        #endregion
    }

    #region Generated Properties
    public int Id { get; set; }
    public string Name { get; set; }
    public string Description { get; set; }
    public int DisplayOrder { get; set; }
    public bool IsActive { get; set; }
    public DateTimeOffset Created { get; set; }
    public string CreatedBy { get; set; }
    public DateTimeOffset Updated { get; set; }
    public string UpdatedBy { get; set; }
    public Byte[] RowVersion { get; set; }
    #endregion

    #region Generated Relationships
    public virtual ICollection<Task> Tasks { get; set; }
    #endregion
}
When the generate command is re-run, the Generated Constructor, Generated Properties, and Generated Relationships regions will be replaced with the current output of the template. Any other changes outside those regions will be safe.
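The region replacement idea can be sketched with a regular expression: only the body of a generated region is swapped out, so hand-written code elsewhere in the file survives regeneration. This is an illustrative sketch, not efg's actual parser.

```csharp
using System;
using System.Text.RegularExpressions;

public static class RegionReplacer
{
    // Replace the body of "#region Generated <name> ... #endregion" with new content,
    // leaving everything outside the region untouched.
    public static string ReplaceRegion(string source, string regionName, string newBody)
    {
        var pattern = $@"(#region Generated {Regex.Escape(regionName)}\r?\n).*?(\r?\n\s*#endregion)";
        return Regex.Replace(
            source,
            pattern,
            m => m.Groups[1].Value + newBody + m.Groups[2].Value,
            RegexOptions.Singleline);
    }
}
```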
Source Parsing
In order to capture and preserve entity, property, and DbContext renames, the generate command parses any existing mapping and DbContext classes to capture how things are named. This allows you to use the full extent of Visual Studio’s refactoring tools to rename things as you like. Then, when regenerating, those changes won’t be lost.
Database Providers
Entity Framework Core Generator supports the following databases.
- SQL Server
- PostgreSQL (coming soon)
- MySQL (coming soon)
- Sqlite (coming soon)
The provider can be set via command line or via the configuration file.
Set via command line
efg generate -c <ConnectionString> -p <Provider>
Set in configuration file
database:
  connectionString: 'Data Source=(local);Initial Catalog=Tracker;Integrated Security=True'
  provider: SqlServer
Database Schema
The database schema is loaded from the metadata model factory implementation of IDatabaseModelFactory. Entity Framework Core Generator uses the implemented interface from each of the supported providers, similar to how ef dbcontext scaffold works.
View Models
Entity Framework Core Generator supports generating Read, Create and Update view models from an entity. Many projects rely on view models to shape data. The model templates can be used to quickly get the basic view models created. The model templates also support regeneration, so any database change can easily be synced to the view models.
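A generated read model might look something like the following. The class name follows the {Entity.Name}ReadModel convention from the configuration, and it uses the same generated-region pattern as the entities; the specific properties shown are assumptions for illustration, not the template's actual output.

```csharp
using System;

public partial class TaskReadModel
{
    #region Generated Properties
    public Guid Id { get; set; }
    public string Title { get; set; }
    public string Description { get; set; }
    public DateTimeOffset Created { get; set; }
    public DateTimeOffset Updated { get; set; }
    #endregion
}
```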
November 28, 2016
Overview
A library to compare two entity object graphs and detect changes.
Features
- Compare complete entity graph including child entities, collections and dictionaries
- Collection compare by index or element equality
- Dictionary compare by key
- Custom value string formatter
- Custom entity equality compare
- Markdown or HTML change report formatter
Download
The EntityChange library is available on nuget.org via package name EntityChange.
To install EntityChange, run the following command in the Package Manager Console
PM> Install-Package EntityChange

Configuration
Configure the Contact properties and collections.
EntityChange.Configuration.Default.Configure(config => config
    .Entity<Contact>(e =>
    {
        // set the FirstName display name
        e.Property(p => p.FirstName).Display("First Name");

        // compare the Roles collection by string equality
        e.Collection(p => p.Roles)
            .CollectionComparison(CollectionComparison.ObjectEquality)
            .ElementEquality(StringEquality.OrdinalIgnoreCase);

        // set how to format the EmailAddress entity as a string
        e.Collection(p => p.EmailAddresses).ElementFormatter(v =>
        {
            var address = v as EmailAddress;
            return address?.Address;
        });
    })
    .Entity<EmailAddress>(e =>
    {
        e.Property(p => p.Address).Display("Email Address");
    })
);
Comparison
Compare two Contact entities.
// create comparer using default configuration
var comparer = new EntityComparer();
// compare original and current instances generating change list
var changes = comparer.Compare(original, current).ToList();
Change Report
Sample output from the MarkdownFormatter
- Removed Administrator from Roles
- Changed Email Address from user@Personal.com to user@gmail.com
- Added user@home.com to Email Addresses
- Changed Status from New to Verified
- Changed Updated from 5/17/2016 8:51:59 PM to 5/17/2016 8:52:00 PM
- Changed Zip from 10026 to 10027
- Changed Number from 888-555-1212 to 800-555-1212
- Added Blah to Categories
- Changed Data from 1 to 2
- Changed Data from ./home to ./path
April 10, 2016
Overview
The MongoDB Messaging library is a lightweight queue pub/sub processing library based on the MongoDB data store.
Features
- Easy to use Fluent API
- Self creating and cleaning of Queues
- Configurable message expiration
- Generic data payload
- Trigger processing from oplog change monitoring
- Configurable auto retry on error
- Message processing timeout
- Scalable via subscriber worker count
- Supports distributed locks
Download
The MongoDB.Messaging library is available on nuget.org via package name MongoDB.Messaging.
To install MongoDB.Messaging, run the following command in the Package Manager Console
PM> Install-Package MongoDB.Messaging

Concepts
Queue
A queue is equivalent to a MongoDB collection. The name of the queue will match the MongoDB collection name.
Queue names must be alphanumeric, without spaces or symbols.
It is a good practice to suffix the queue name with Queue.
Message
A message is the high-level object that is a generic definition of a message. The message contains processing-level information. The message object is automatically created and updated by the Fluent API and should not be updated directly by the publisher or subscriber.
Data
Data is the message payload to be processed with the message. Use data to pass information you need to process the message.
The data object must be serializable by MongoDB driver.
It is a good practice to have one queue per data object being passed to limit confusion and to maintain simplicity when subscribing to the queue.
Publish
Publishing a message adds the message with the corresponding data to a queue for processing.
Subscribe
In order to process a message on a queue, an application needs to subscribe to a queue. There can be many subscribers to a queue to scale the load across processes. A subscriber can also set the worker count to scale the number of processing threads for that subscriber.
The framework ensures that only one subscriber can process a message.
Queue Configuration
The queue configuration is used to set default values on messages published to a queue.
An example of using the fluent api to configure the sleep queue.
MessageQueue.Default.Configure(c => c
    .Connection("MongoMessaging")
    .Queue(s => s
        .Name(SleepMessage.QueueName)
        .Priority(MessagePriority.Normal)
        .ResponseQueue("ReplyQueueName")
        .Retry(5)
    )
);
Properties
Connection is the app.config connection string name used to connect to MongoDB.
Name is the name of the queue to configure.
Retry is the number of times the message should be retried on error. Set to zero, default, to not retry.
Priority is the default priority to publish the message with.
ResponseQueue is the name of the queue where responses should be sent.
Publish Message
To publish a message to a queue, use the fluent api.
var message = await MessageQueue.Default.Publish(m => m
    .Queue(SleepMessage.QueueName)
    .Data(sleepMessage)
    .Correlation("321B4671-3B4C-4B97-8E81-D6A8CF22D4F0")
    .Description("User friendly description of the message")
    .Priority(MessagePriority.Normal)
    .Retry(1)
);
Properties
Required
Queue is the name of the queue to publish to.
Data is the object to pass in the message. Used to process the message by the subscriber.
Optional
Correlation is an identifier used to link messages together.
Description is a user friendly description of the message.
Overrides
Retry is the number of times the message should be retried on error.
Priority is the default priority to publish the message with.
ResponseQueue is the name of the queue where responses should be sent.
Notes
- When setting the Data property, the message Name will be set to the Type name of the data object.
- When setting the Data property and Description hasn’t been set, the data object’s ToString() value will be used as the description.
- If the underlying storage collection doesn’t exist, it will be created on first publish.
Subscribe to Message
To subscribe to a queue, use the fluent api. The subscribe handler must implement IMessageSubscriber.
MessageQueue.Default.Configure(c => c
    .Connection("MongoMessaging")
    .Subscribe(s => s
        .Queue(SleepMessage.QueueName)
        .Handler<SleepHandler>()
        .Workers(4)
    )
);
To speed up processing, you can monitor the oplog for changes to trigger processing. The connection must have access to local.oplog.rs.
MessageQueue.Default.Configure(c => c
    .Connection("MongoMessaging")
    .Subscribe(s => s
        .Queue(SleepMessage.QueueName)
        .Handler<SleepHandler>()
        .Workers(4)
        .Trigger()
    )
);
Properties
Required
Queue is the name of the queue to publish to.
Handler is the class that implements IMessageSubscriber. This is what processes the message.
Optional
Workers is the number of worker processes.
ExpireError is how long to keep error messages.
ExpireWarning is how long to keep warning messages.
ExpireSuccessful is how long to keep successful messages.
PollTime is the amount of time between work polling. If using Trigger, set to a longer time.
Retry is a class that implements IMessageRetry. IMessageRetry controls if an error message should be retried.
Timeout is the amount of time before a processing message times out.
TimeoutAction is how to handle timed out messages. Options are Fail or Retry.
Trigger to enable monitoring of the oplog for changes to trigger processing.
Message Service
In order for the message subscribers to process messages off the queue, the MessageService needs to be created and Start called. Note that the MessageService.Stop() method tries to gracefully stop by waiting for active processes to finish.
_messageService = new MessageService();
// on service or application start
_messageService.Start();
// on service stop.
_messageService.Stop();
IMessageSubscriber Interface
The following is a sample implementation of IMessageSubscriber
public class SleepHandler : IMessageSubscriber
{
    public MessageResult Process(ProcessContext context)
    {
        // get message data
        var sleepMessage = context.Data<SleepMessage>();

        // Do processing here

        return MessageResult.Successful;
    }

    public void Dispose()
    {
        // free resources
    }
}
IMessageRetry Interface
The IMessageRetry interface allows for customization of the retry of failed messages.
The following is the default implementation of IMessageRetry.
public class MessageRetry : IMessageRetry
{
    public virtual bool ShouldRetry(ProcessContext processContext, Exception exception)
    {
        // get current message
        var message = processContext.Message;

        // true to retry message
        return message.ErrorCount < message.RetryCount;
    }

    public virtual DateTime NextAttempt(ProcessContext processContext)
    {
        var message = processContext.Message;

        // retry weight, 1 = 1 min, 2 = 30 min, 3 = 2 hrs, 4+ = 8 hrs
        if (message.ErrorCount > 3)
            return DateTime.Now.AddHours(8);

        if (message.ErrorCount == 3)
            return DateTime.Now.AddHours(2);

        if (message.ErrorCount == 2)
            return DateTime.Now.AddMinutes(30);

        // default
        return DateTime.Now.AddMinutes(1);
    }
}
Process Locks
The library supports distributed locks. The following lock types are supported.
- DistributedLock - a distributed lock manager that provides synchronized access to a resource over a network
- ThrottleLock - a throttle lock manager that controls how frequently a process can run
This is an example of using the DistributedLock.
var lockName = "PrintMessage";

// get MongoDB collection to store lock
var collection = GetCollection();

// create lock with timeout, max time it will wait for lock, of 5 minutes
var locker = new DistributedLock(collection, TimeSpan.FromMinutes(5));

// acquire lock; if it can't, it will retry to get the lock up to the timeout value
var result = locker.Acquire(lockName);
if (!result)
    return; // acquire lock timed out

try
{
    // do processing here
}
finally
{
    // release lock
    locker.Release(lockName);
}