PrePAN

Requests for Reviews Feed

DBIx::Class::Factory Factory-style fixtures for DBIx::Class

I'm trying to create a factory_girl/factory_boy analogue for Perl.
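
For a rough idea of the intended factory-girl style, here is a sketch (My::Schema stands in for a connected DBIx::Class schema; the method names may still change):

package My::UserFactory;
use strict;
use warnings;
use parent 'DBIx::Class::Factory';

# bind the factory to a result set and declare default field values
__PACKAGE__->resultset(My::Schema->connect('dbi:SQLite:test.db')->resultset('User'));
__PACKAGE__->fields({
    name  => 'John Doe',
    email => 'john@example.com',
});

package main;

# creates a row in the users table, overriding one default
my $user = My::UserFactory->create({ name => 'Alice' });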

I would be happy to name it Test::DBIx::Class::Factory (since my module is mostly intended for tests), but that name is already taken.

This is my first distribution and I messed up a little with the version numbers, so don't look at that :).

What should I improve?

Thanks!

Also see the project on GitHub.

VadimPushtaev@github 0 comments

Data::Transit Implementation of the Transit format in Perl

Transit is a format layered on top of JSON and MessagePack which encodes the type into the data being sent. This library provides support for converting to/from Transit in Perl. If you want information on the format in general, you can get that at https://github.com/cognitect/transit-format.
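
To give a rough feel for it, a sketch of intended usage (the constructor names here are assumed and may not match the final API):

use Data::Transit;

# write a Perl structure as Transit-over-JSON to an in-memory filehandle
my $json = '';
open my $fh, '>', \$json or die $!;
my $writer = Data::Transit::writer('json', $fh);   # assumed constructor
$writer->write({ point => [1, 2] });
close $fh;

# read it back into a Perl structure
my $reader = Data::Transit::reader('json');        # assumed constructor
my $data   = $reader->read($json);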

lackita@github 1 comment

MooX::Cmd::ChainedOptions Easy access to options up the chain of commands

When used instead of MooX::Options in applications constructed with MooX::Cmd, this class adds attributes to a command object which provide access to command line options passed to commands higher up the command chain.
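
Here is a minimal sketch of the kind of application and command meant below (it assumes this module exports option, as MooX::Options does):

package MyApp;
use Moo;
use MooX::Cmd;
use MooX::Cmd::ChainedOptions;

option app_opt => (is => 'ro', format => 's');

sub execute { }

package MyApp::Cmd::cmd;
use Moo;
use MooX::Cmd;
use MooX::Cmd::ChainedOptions;

option cmd_opt => (is => 'ro', format => 's');

sub execute {
    my ($self) = @_;
    # app_opt comes from the top-level command, via this module
    print $self->app_opt, "\n";
}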

For example (using the sketch above, which creates an application and a command), the following command line

app --app-opt=FOO cmd --cmd-opt=BAR

would result in the MyApp::Cmd::cmd object having an attribute called app_opt which contains FOO.

Without this module, the MyApp::Cmd::cmd object would have to search through the chain of commands (passed to the execute method, or available via the command_chain method) looking for the app_opt attribute.

I'm afraid the name may be confused with "chained" method approaches, but MooX::Cmd uses the "chain" terminology to describe the hierarchy of commands, so I thought I'd stick to that.

Any alternate suggestions?

Thanks, Diab

djerius@github 0 comments

Net::Amazon::IAM Perl interface to Amazon Identity and Access Management.

I looked for an existing module like this and didn't find any.

I needed it to accomplish some tasks for my job.

I used PHP temporarily, since there was nothing similar in Perl.

It's not fully done yet: not all the methods offered by the API are implemented, but the basic functionality is here, tested and found working.
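
Basic usage looks roughly like this (a sketch; the argument style follows other Net::Amazon::* modules, and the method name is assumed):

use Net::Amazon::IAM;

my $iam = Net::Amazon::IAM->new(
    AWSAccessKeyId  => $ENV{AWS_ACCESS_KEY},
    SecretAccessKey => $ENV{AWS_SECRET_KEY},
);

# e.g. creating a user (method name assumed)
my $user = $iam->create_user(UserName => 'deploy-bot');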

As next steps, I'm going to implement some tests and add the other functionality available in the API.

tsiganenok@github 1 comment

WWW::Mech restructuring of the WWW::Mechanize ecosystem

I use the WWW::Mechanize::* and Test::WWW::Mechanize::* modules quite a lot, but they have the problem that they all pretty much do their own thing and don't aggregate cleanly. For example, I recently wanted to use Test::WWW::Mechanize::JSON with Test::WWW::Mechanize::PSGI. I ended up monkey-patching the wanted methods of the former into the latter. It worked fine, but felt icky.

So what I would like to do is try to come up with a modular design for all things WWW::Mechanize. I'm thinking of the following:

Split the functionality of WWW::Mechanize between two main areas: the agent, whose job is solely to take requests and give back responses, and the plugins, which do things to those requests/responses.

So the creation of a $mech object will be along the lines of

my $mech = WWW::Mech->new(
    agent => 'WWW::Mech::Agent::PSGI',
    plugins => [ 
        'WWW::Mech::Plugin::Test',  # brings in get_ok, content_is, etc
        'WWW::Mech::Plugin::JSON',  #  brings in json_ok and friends
    ],
);

$mech->get_ok( 'http://localhost' );
$mech->json_ok;

In the same vein, I'd also make it possible to have a $mech singleton so that, for quick tests, one can do away with the $mech-> bit:

WWW::Mech->singleton(
    agent => 'WWW::Mech::Agent::PSGI',
    plugins => [ 
        'WWW::Mech::Plugin::Test',  # brings in get_ok, content_is, etc
        'WWW::Mech::Plugin::JSON',  #  brings in json_ok and friends
    ],
);

get_ok 'http://localhost';
json_ok;

So... does this sound like something worth exploring?

yanick@github 2 comments

IMDB::Local IMDb database creation and access tools

As part of xmltv (https://sourceforge.net/projects/xmltv/) there is a tool I wrote years ago called tv_imdb; it downloads the IMDb list files (http://www.imdb.com/interfaces) and creates a local database which can then be used to augment xmltv guide data with IMDb movies, actors, plots, etc.

As part of a major rewrite I've decided to use an SQLite database rather than a bunch of flat files. Among other things this will improve performance, flexibility, searching and re-use.

The Perl package would consist of several modules:

  • IMDB::Local - provides the interface to create an SQLite database and load the IMDb list files
  • IMDB::Local::DBI - methods to interact with the database
  • IMDB::Local::Title - OO query interface

I'd also include a couple of command line tools to aid in downloading the IMDb list files, and sample utilities to create and query the local database.
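
To give a feel for it, usage might look something like this (the method names are illustrative, not final):

use IMDB::Local;
use IMDB::Local::Title;

# build the local SQLite database from the downloaded list files
my $imdb = IMDB::Local->new(database => 'imdb.db');
$imdb->import_list_files('lists/');

# query it through the OO interface
my @titles = IMDB::Local::Title->search($imdb, Title => 'Casablanca');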

How does this look? Can you suggest a better name?

Thanks, Jerry

JerryVeldhuis@github 0 comments

Algorithm::Statistic

Hi all. I'm going to create a big XS module which allows calculating various statistics. In future versions, more algorithms will be added, for instance some graph algorithms.

All of these algorithms are used at my work. It's hard work to put them in order and to write documentation, but I hope it will be useful to the community.

In the first version, the k-th order statistic and the median are available.
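
Usage is intended to be along these lines (a sketch; the exported function names and the 0-based k are assumptions):

use Algorithm::Statistic qw(kth_order_statistic median);

my @data = (7, 2, 9, 4, 1);

# selection in expected linear time, without sorting the whole array
my $third = kth_order_statistic(\@data, 2);   # 3rd smallest (assuming 0-based k): 4
my $med   = median(\@data);                   # 4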

Of course, some of these algorithms already exist on CPAN, but I'm going to add only those which don't have an XS implementation, or where my version is more efficient.

Please review my module. It's really important for me to hear CPAN experts' opinions.

karbachinsky@github 4 comments

DBIx::Class::InflateColumn::Serializer::Role::HashContentAccessor Parameterized Moose role which provides accessor methods for values stored in a hash-like structure

This parameterized role provides methods to access values of a column that stores 'hash-like' data in a DBIx::Class-based database schema that uses Moose. It assumes that the (de)serializing of the column is done using something like DBIx::Class::InflateColumn::Serializer, i.e. that the inflated values are hash references that get serialized to something the database can store on INSERT/UPDATE.

This module provides the following advantages over using the inflated hashref directly:

  • If the column is nullable, you don't have to take care of NULL values yourself - the methods provided by this role do that already.
  • It's easy to provide a default when getting the value of a key which is not necessarily already stored in the column data.
  • If you remove the key-value-based column and replace it with dedicated columns in the future, you can simply remove the role and provide compatible accessors yourself. This allows you to keep the interface of the result class.
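
As an illustration, applying the role could look roughly like this (the role's parameter names and the generated method names are illustrative):

package My::Schema::Result::User;
use Moose;
extends 'DBIx::Class::Core';

__PACKAGE__->table('users');
__PACKAGE__->add_columns(
    id         => { data_type => 'integer', is_auto_increment => 1 },
    properties => { data_type => 'text', is_nullable => 1,
                    serializer_class => 'JSON' },   # inflated via DBIx::Class::InflateColumn::Serializer
);
__PACKAGE__->set_primary_key('id');

with 'DBIx::Class::InflateColumn::Serializer::Role::HashContentAccessor' => {
    column => 'properties',
    name   => 'property',
};

# generated accessors would include, e.g.:
#   $user->get_property('theme', 'dark');   # second argument is the default
#   $user->set_property(theme => 'light');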

mstock@github 0 comments

WebService::MODIS Downloading MODIS satellite data

A variety of preprocessed satellite data is available from the Moderate Resolution Imaging Spectroradiometer (MODIS). This module provides an interface to download these satellite products, which are stored in HDF format. The data can, for example, be further processed by the MODIS Reprojection Tool (MRT: https://lpdaac.usgs.gov/tools/modis_reprojection_tool) or other software.

The module creates a download list for various MODIS products (e.g. https://lpdaac.usgs.gov/) and downloads them. Alternatively, the list can be written to a file and passed to wget for downloading. Resuming interrupted downloads is supported through LWP::UserAgent.
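
A typical session might look like this (a sketch; the constructor parameters and method names are assumed):

use WebService::MODIS;

my $modis = WebService::MODIS->new(
    product => 'MOD13Q1',                    # vegetation indices
    dates   => ['2014-06-01', '2014-06-30'],
    h       => [18],                         # horizontal tile
    v       => [4],                          # vertical tile
);

my @urls = $modis->url_list;                 # e.g. write this to a file for wget
$modis->download('data/');                   # or fetch directly via LWP::UserAgent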

First, the server-side directory structure is read; it can be stored in local configuration files, which speeds up repeated usage substantially. However, these caching routines are the part that still has an "unperlish" API. I have not yet found a good way to create a "cache" object that is part of all WebService::MODIS objects.

joergsteinkamp@github 3 comments

DBIx::Schema::Changelog Continuous Database Migration

A package that allows continuous development of an application while keeping the corresponding database schema in sync.

MOTIVATION

When several people work on a large project that is bound to a database, every now and then their databases end up at different levels of development.

You can keep them in sync with SQL statements, but these are then incompatible with other database systems.

mziescha@github 2 comments